Apr 17 20:01:31.425831 ip-10-0-131-172 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 20:01:31.425842 ip-10-0-131-172 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 20:01:31.425850 ip-10-0-131-172 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 20:01:31.426078 ip-10-0-131-172 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 20:01:41.559412 ip-10-0-131-172 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 20:01:41.559427 ip-10-0-131-172 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7803e7435bad4e4187b8d35b9aab21a7 --
Apr 17 20:04:21.027128 ip-10-0-131-172 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 20:04:21.407160 ip-10-0-131-172 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:04:21.407160 ip-10-0-131-172 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 20:04:21.407160 ip-10-0-131-172 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:04:21.407160 ip-10-0-131-172 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 20:04:21.407160 ip-10-0-131-172 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:04:21.409688 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.409599 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 20:04:21.413492 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413476 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:04:21.413492 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413493 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413496 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413500 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413503 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413506 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413509 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413511 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413514 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413517 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413520 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413522 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413525 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413527 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413530 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413539 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413541 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413544 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413546 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413549 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413552 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:04:21.413556 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413555 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413557 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413560 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413563 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413565 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413568 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413571 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413574 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413578 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413581 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413583 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413586 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413589 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413591 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413594 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413596 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413599 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413601 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413604 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413607 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:04:21.414049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413612 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413616 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413620 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413622 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413625 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413627 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413630 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413633 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413635 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413638 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413640 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413643 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413645 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413648 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413653 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413657 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413661 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413664 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413667 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:04:21.414533 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413670 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413672 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413675 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413679 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413682 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413684 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413687 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413690 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413693 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413695 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413698 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413700 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413703 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413706 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413709 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413714 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413716 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413719 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413721 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413724 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:04:21.415007 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413727 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413731 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413734 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413737 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413739 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.413742 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414126 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414133 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414136 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414139 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414141 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414144 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414146 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414149 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414152 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414154 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414157 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414160 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414163 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414165 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:04:21.415478 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414168 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414170 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414173 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414175 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414178 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414181 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414183 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414185 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414188 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414191 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414193 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414196 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414199 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414202 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414204 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414207 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414209 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414212 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414214 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414217 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:04:21.415981 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414220 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414223 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414226 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414228 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414230 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414233 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414235 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414238 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414240 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414242 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414245 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414248 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414250 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414253 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414255 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414258 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414260 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414263 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414265 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414268 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:04:21.416509 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414270 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414273 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414275 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414277 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414281 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414284 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414287 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414290 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414292 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414294 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414297 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414300 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414303 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414305 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414308 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414312 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414315 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414318 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414321 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:04:21.417015 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414323 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414326 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414328 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414331 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414333 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414337 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414339 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414342 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414344 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414347 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414349 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414351 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.414354 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414937 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414946 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414954 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414958 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414963 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414966 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414971 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414976 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 20:04:21.417559 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414979 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414983 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414986 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414991 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414994 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.414998 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415001 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415004 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415007 2576 flags.go:64] FLAG: --cloud-config=""
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415011 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415014 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415018 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415021 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415024 2576 flags.go:64] FLAG: --config-dir=""
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415027 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415030 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415034 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415037 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415040 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415043 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415046 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415049 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415052 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415055 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415058 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 20:04:21.418091 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415063 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415066 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415069 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415071 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415075 2576 flags.go:64] FLAG: --enable-server="true"
Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415078 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr
17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415083 2576 flags.go:64] FLAG: --event-burst="100" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415086 2576 flags.go:64] FLAG: --event-qps="50" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415090 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415096 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415099 2576 flags.go:64] FLAG: --eviction-hard="" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415103 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415106 2576 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415109 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415112 2576 flags.go:64] FLAG: --eviction-soft="" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415115 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415118 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415121 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415124 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415127 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415129 
2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415132 2576 flags.go:64] FLAG: --feature-gates="" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415136 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415139 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415142 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 20:04:21.418698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415146 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415149 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415152 2576 flags.go:64] FLAG: --help="false" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415155 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415158 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415161 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415164 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415168 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415171 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 20:04:21.419366 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:04:21.415174 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415177 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415180 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415183 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415186 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415189 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415193 2576 flags.go:64] FLAG: --kube-reserved="" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415197 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415200 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415203 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415206 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415209 2576 flags.go:64] FLAG: --lock-file="" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415212 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415214 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415217 2576 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 17 20:04:21.419366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415223 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415225 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415228 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415231 2576 flags.go:64] FLAG: --logging-format="text" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415234 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415238 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415240 2576 flags.go:64] FLAG: --manifest-url="" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415243 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415248 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415251 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415255 2576 flags.go:64] FLAG: --max-pods="110" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415258 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415262 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415265 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 20:04:21.419950 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:04:21.415268 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415271 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415274 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415277 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415285 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415288 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415291 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415294 2576 flags.go:64] FLAG: --pod-cidr="" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415297 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 20:04:21.419950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415304 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415308 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415312 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415315 2576 flags.go:64] FLAG: --port="10250" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415318 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 
20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415321 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c2cd0e650a02264b" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415324 2576 flags.go:64] FLAG: --qos-reserved="" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415327 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415331 2576 flags.go:64] FLAG: --register-node="true" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415333 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415336 2576 flags.go:64] FLAG: --register-with-taints="" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415340 2576 flags.go:64] FLAG: --registry-burst="10" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415343 2576 flags.go:64] FLAG: --registry-qps="5" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415345 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415348 2576 flags.go:64] FLAG: --reserved-memory="" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415353 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415356 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415359 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415361 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415364 2576 flags.go:64] FLAG: --runonce="false" Apr 17 
20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415367 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415370 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415373 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415380 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415382 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415385 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 20:04:21.420490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415388 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415392 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415394 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415397 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415400 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415403 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415406 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415411 2576 flags.go:64] FLAG: --system-cgroups="" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: 
I0417 20:04:21.415414 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415420 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415423 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415426 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415430 2576 flags.go:64] FLAG: --tls-min-version="" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415433 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415436 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415439 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415442 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415445 2576 flags.go:64] FLAG: --v="2" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415449 2576 flags.go:64] FLAG: --version="false" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415453 2576 flags.go:64] FLAG: --vmodule="" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415457 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.415460 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415552 2576 feature_gate.go:328] unrecognized feature gate: 
ShortCertRotation Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415556 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:04:21.421128 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415559 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415562 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415565 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415568 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415571 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415574 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415577 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415580 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415582 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415585 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415588 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: 
W0417 20:04:21.415592 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415596 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415598 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415602 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415608 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415611 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415615 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415618 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:04:21.421880 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415620 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415623 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415626 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415628 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 
20:04:21.415631 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415633 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415636 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415638 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415641 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415644 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415646 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415649 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415652 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415654 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415657 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415659 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415662 2576 feature_gate.go:328] unrecognized feature gate: 
NetworkSegmentation Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415664 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415668 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415671 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:04:21.422633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415673 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415676 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415678 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415681 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415683 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415686 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415689 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415691 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415695 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415697 2576 feature_gate.go:328] 
unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415700 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415703 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415706 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415708 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415711 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415713 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415716 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415718 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415721 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415723 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:04:21.423230 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415726 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415728 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415731 2576 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415733 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415736 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415738 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415741 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415743 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415746 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415748 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415752 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415755 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415757 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415760 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415762 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 
20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415765 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415768 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415770 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415773 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:04:21.423714 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415775 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415779 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415782 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415785 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415787 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.415797 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.416425 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.423906 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.423922 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.423979 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.423984 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.423988 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.423991 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.423995 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.423998 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:04:21.424203 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424001 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424003 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424006 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: 
W0417 20:04:21.424010 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424012 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424015 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424018 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424021 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424024 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424026 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424029 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424031 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424034 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424036 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424039 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424041 2576 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424044 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424047 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424050 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424053 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:04:21.424588 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424055 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424058 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424060 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424063 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424066 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424069 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424072 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424075 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 
20:04:21.424078 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424081 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424083 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424086 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424088 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424091 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424093 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424096 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424099 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424102 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424104 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424107 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:04:21.425098 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424109 2576 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424112 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424114 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424117 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424120 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424123 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424125 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424128 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424130 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424133 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424135 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424138 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424140 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:04:21.425575 ip-10-0-131-172 
kubenswrapper[2576]: W0417 20:04:21.424142 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424147 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424151 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424154 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424159 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424163 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:04:21.425575 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424167 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424170 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424173 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424175 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424178 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424180 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424183 
2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424185 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424188 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424190 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424193 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424196 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424199 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424201 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424204 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424207 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424209 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424212 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424215 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:04:21.426053 ip-10-0-131-172 
kubenswrapper[2576]: W0417 20:04:21.424218 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:04:21.426053 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424220 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.424225 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424326 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424331 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424334 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424337 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424339 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424342 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424345 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC 
Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424348 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424352 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424354 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424357 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424360 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424362 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424365 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:04:21.426547 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424367 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424370 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424372 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424375 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424378 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424380 2576 feature_gate.go:328] unrecognized 
feature gate: AzureClusterHostedDNSInstall Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424383 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424385 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424388 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424391 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424394 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424396 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424399 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424401 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424403 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424406 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424409 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424411 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 
20:04:21.424414 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:04:21.426963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424416 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424419 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424422 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424425 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424429 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424431 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424434 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424437 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424439 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424442 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424444 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424447 2576 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424450 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424452 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424455 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424457 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424460 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424462 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424465 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:04:21.427413 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424467 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424470 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424472 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424476 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424478 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:04:21.427894 ip-10-0-131-172 
kubenswrapper[2576]: W0417 20:04:21.424481 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424484 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424486 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424489 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424491 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424494 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424496 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424499 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424501 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424503 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424506 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424508 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424511 2576 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424513 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:04:21.427894 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424516 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424520 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424523 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424526 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424529 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424532 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424534 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424537 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424540 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424543 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424546 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI 
Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424548 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424551 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424554 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:21.424556 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:04:21.428379 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.424561 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:04:21.428749 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.425292 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 20:04:21.428956 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.428942 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 20:04:21.429822 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.429812 2576 server.go:1019] "Starting client certificate rotation" Apr 17 20:04:21.429941 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.429921 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 20:04:21.429992 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.429958 2576 
certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 20:04:21.450168 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.450145 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 20:04:21.452649 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.452634 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 20:04:21.469073 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.469047 2576 log.go:25] "Validated CRI v1 runtime API" Apr 17 20:04:21.474035 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.474018 2576 log.go:25] "Validated CRI v1 image API" Apr 17 20:04:21.476489 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.476470 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 20:04:21.477535 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.477511 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 20:04:21.479157 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.479134 2576 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 94f075ae-4a6a-4f7e-b5dc-b529829c77e2:/dev/nvme0n1p3 c6d74c25-efff-433c-af1c-a54740ddccf2:/dev/nvme0n1p4] Apr 17 20:04:21.479221 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.479159 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 
minor:33 fsType:overlay blockSize:0}] Apr 17 20:04:21.485261 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.485151 2576 manager.go:217] Machine: {Timestamp:2026-04-17 20:04:21.48339192 +0000 UTC m=+0.351211672 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098536 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2552653157506ed0052ccaa187f76c SystemUUID:ec255265-3157-506e-d005-2ccaa187f76c BootID:7803e743-5bad-4e41-87b8-d35b9aab21a7 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c5:2b:6b:ab:bb Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c5:2b:6b:ab:bb Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f2:c6:e8:46:20:39 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 
Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 20:04:21.485261 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.485258 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 17 20:04:21.485373 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.485338 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 20:04:21.485667 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.485647 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 20:04:21.485810 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.485667 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-131-172.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 20:04:21.485870 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.485820 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 20:04:21.485870 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.485829 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 20:04:21.485870 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.485861 
2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 20:04:21.486383 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.486373 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 20:04:21.487575 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.487565 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:04:21.487685 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.487676 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 20:04:21.489544 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.489533 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 17 20:04:21.489582 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.489555 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 20:04:21.489582 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.489567 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 20:04:21.489582 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.489577 2576 kubelet.go:397] "Adding apiserver pod source" Apr 17 20:04:21.489666 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.489586 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 20:04:21.490614 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.490600 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 20:04:21.490652 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.490628 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 20:04:21.493146 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.493129 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 20:04:21.494442 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:04:21.494428 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 20:04:21.495913 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.495900 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 20:04:21.495953 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.495919 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 20:04:21.495953 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.495929 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 20:04:21.495953 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.495939 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 20:04:21.495953 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.495949 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 20:04:21.496065 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.495957 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 20:04:21.496065 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.495964 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 20:04:21.496065 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.495969 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 20:04:21.496065 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.495976 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 20:04:21.496065 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.495982 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 20:04:21.496065 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.496002 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 
20:04:21.496065 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.496011 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 20:04:21.496608 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.496598 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 20:04:21.496641 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.496609 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 20:04:21.496802 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.496787 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dksxk" Apr 17 20:04:21.500703 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.500688 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 20:04:21.500759 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.500728 2576 server.go:1295] "Started kubelet" Apr 17 20:04:21.500811 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.500789 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 20:04:21.500974 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.500922 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 20:04:21.501071 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.501002 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 20:04:21.501409 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.501391 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-172.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 20:04:21.501494 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.501409 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User 
\"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 20:04:21.501582 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.501521 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-172.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 20:04:21.501638 ip-10-0-131-172 systemd[1]: Started Kubernetes Kubelet. Apr 17 20:04:21.502185 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.502127 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 20:04:21.503389 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.503373 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 17 20:04:21.504447 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.504423 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dksxk" Apr 17 20:04:21.509098 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.509064 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 20:04:21.509893 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.509877 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 20:04:21.510268 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.510252 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 20:04:21.510974 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.510956 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 20:04:21.511121 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.511103 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 20:04:21.511186 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.511131 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 20:04:21.511186 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.508057 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-172.ec2.internal.18a73d8ec9c489dd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-172.ec2.internal,UID:ip-10-0-131-172.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-172.ec2.internal,},FirstTimestamp:2026-04-17 20:04:21.500701149 +0000 UTC m=+0.368520904,LastTimestamp:2026-04-17 20:04:21.500701149 +0000 UTC m=+0.368520904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-172.ec2.internal,}" Apr 17 20:04:21.511321 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.511308 
2576 reconstruct.go:97] "Volume reconstruction finished" Apr 17 20:04:21.511376 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.511321 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 17 20:04:21.511690 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.511528 2576 factory.go:55] Registering systemd factory Apr 17 20:04:21.511690 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.511553 2576 factory.go:223] Registration of the systemd container factory successfully Apr 17 20:04:21.511810 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.511752 2576 factory.go:153] Registering CRI-O factory Apr 17 20:04:21.511810 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.511767 2576 factory.go:223] Registration of the crio container factory successfully Apr 17 20:04:21.511937 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.511834 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 20:04:21.511937 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.511875 2576 factory.go:103] Registering Raw factory Apr 17 20:04:21.511937 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.511891 2576 manager.go:1196] Started watching for new ooms in manager Apr 17 20:04:21.512429 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.511908 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found" Apr 17 20:04:21.512492 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.512467 2576 manager.go:319] Starting recovery of all containers Apr 17 20:04:21.514028 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.514007 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:04:21.516639 ip-10-0-131-172 kubenswrapper[2576]: E0417 
20:04:21.516617 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-172.ec2.internal\" not found" node="ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.523099 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.522965 2576 manager.go:324] Recovery completed Apr 17 20:04:21.527088 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.527074 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:04:21.529472 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.529459 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:04:21.529523 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.529489 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:04:21.529523 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.529499 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:04:21.529970 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.529957 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 20:04:21.530042 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.529970 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 20:04:21.530042 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.529989 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:04:21.532064 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.532051 2576 policy_none.go:49] "None policy: Start" Apr 17 20:04:21.532128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.532069 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 20:04:21.532128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.532082 2576 state_mem.go:35] "Initializing new in-memory state 
store" Apr 17 20:04:21.572420 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.572404 2576 manager.go:341] "Starting Device Plugin manager" Apr 17 20:04:21.577906 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.572445 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 20:04:21.577906 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.572459 2576 server.go:85] "Starting device plugin registration server" Apr 17 20:04:21.577906 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.572700 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 20:04:21.577906 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.572710 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 20:04:21.577906 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.572790 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 20:04:21.577906 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.572960 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 20:04:21.577906 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.572979 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 20:04:21.577906 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.573904 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 20:04:21.577906 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.573941 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-172.ec2.internal\" not found" Apr 17 20:04:21.631907 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.631864 2576 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 17 20:04:21.633109 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.633090 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 20:04:21.633222 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.633116 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 20:04:21.633222 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.633138 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 20:04:21.633222 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.633145 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 20:04:21.633222 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.633183 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 20:04:21.636314 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.636289 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:04:21.673086 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.673028 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:04:21.673926 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.673909 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:04:21.674001 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.673943 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:04:21.674001 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.673953 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:04:21.674001 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.673974 2576 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.683266 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.683251 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.683323 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.683273 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-172.ec2.internal\": node \"ip-10-0-131-172.ec2.internal\" not found" Apr 17 20:04:21.700553 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.700529 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found" Apr 17 20:04:21.733780 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.733747 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-172.ec2.internal"] Apr 17 20:04:21.733888 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.733858 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:04:21.735496 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.735472 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:04:21.735618 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.735506 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:04:21.735618 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.735520 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:04:21.736824 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.736810 2576 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 17 20:04:21.736983 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.736969 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.737027 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.736997 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:04:21.737517 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.737499 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:04:21.737594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.737529 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:04:21.737594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.737493 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:04:21.737594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.737558 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:04:21.737594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.737568 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:04:21.737594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.737540 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:04:21.739157 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.739131 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.739261 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.739196 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:04:21.740231 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.740213 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:04:21.740316 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.740248 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:04:21.740316 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.740265 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:04:21.765543 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.765517 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-172.ec2.internal\" not found" node="ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.769956 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.769940 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-172.ec2.internal\" not found" node="ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.801439 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.801412 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found" Apr 17 20:04:21.811723 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.811696 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2d343d1c847cd1fba773386a10885afd-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal\" (UID: \"2d343d1c847cd1fba773386a10885afd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.811817 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.811727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d343d1c847cd1fba773386a10885afd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal\" (UID: \"2d343d1c847cd1fba773386a10885afd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.811817 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.811750 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/980e96a3ec854dc0a95f332d23ffd70e-config\") pod \"kube-apiserver-proxy-ip-10-0-131-172.ec2.internal\" (UID: \"980e96a3ec854dc0a95f332d23ffd70e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.902130 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:21.902091 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found" Apr 17 20:04:21.912428 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.912408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/980e96a3ec854dc0a95f332d23ffd70e-config\") pod \"kube-apiserver-proxy-ip-10-0-131-172.ec2.internal\" (UID: \"980e96a3ec854dc0a95f332d23ffd70e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.912489 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.912437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/2d343d1c847cd1fba773386a10885afd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal\" (UID: \"2d343d1c847cd1fba773386a10885afd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.912489 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.912458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d343d1c847cd1fba773386a10885afd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal\" (UID: \"2d343d1c847cd1fba773386a10885afd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.912564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.912504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/980e96a3ec854dc0a95f332d23ffd70e-config\") pod \"kube-apiserver-proxy-ip-10-0-131-172.ec2.internal\" (UID: \"980e96a3ec854dc0a95f332d23ffd70e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.912564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.912511 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d343d1c847cd1fba773386a10885afd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal\" (UID: \"2d343d1c847cd1fba773386a10885afd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal" Apr 17 20:04:21.912564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:21.912557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2d343d1c847cd1fba773386a10885afd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal\" (UID: \"2d343d1c847cd1fba773386a10885afd\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal" Apr 17 20:04:22.002834 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:22.002755 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found" Apr 17 20:04:22.067216 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.067192 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal" Apr 17 20:04:22.072670 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.072652 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-172.ec2.internal" Apr 17 20:04:22.103344 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:22.103310 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found" Apr 17 20:04:22.203850 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:22.203811 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found" Apr 17 20:04:22.304415 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:22.304346 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found" Apr 17 20:04:22.404998 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:22.404969 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found" Apr 17 20:04:22.430424 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.430398 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 20:04:22.430824 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.430527 2576 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 20:04:22.430824 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.430544 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 20:04:22.505699 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:22.505673 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found" Apr 17 20:04:22.507898 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.507812 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 19:59:21 +0000 UTC" deadline="2027-09-21 10:33:44.976974774 +0000 UTC" Apr 17 20:04:22.507898 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.507884 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12518h29m22.469095528s" Apr 17 20:04:22.510026 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.510005 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 20:04:22.523549 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.523526 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 20:04:22.545617 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.545592 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-7p67q"
Apr 17 20:04:22.553566 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.553547 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7p67q"
Apr 17 20:04:22.576924 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:22.576890 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod980e96a3ec854dc0a95f332d23ffd70e.slice/crio-678c278f60fe9f8427a3e6feac6b3f97d842d1255cb3e69a4cd3d9a7c6f3dd97 WatchSource:0}: Error finding container 678c278f60fe9f8427a3e6feac6b3f97d842d1255cb3e69a4cd3d9a7c6f3dd97: Status 404 returned error can't find the container with id 678c278f60fe9f8427a3e6feac6b3f97d842d1255cb3e69a4cd3d9a7c6f3dd97
Apr 17 20:04:22.578038 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:22.578016 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d343d1c847cd1fba773386a10885afd.slice/crio-10b199a48ffae2a57a17d4307c3b50101030b5af2079555231c53b3c1e2d1c50 WatchSource:0}: Error finding container 10b199a48ffae2a57a17d4307c3b50101030b5af2079555231c53b3c1e2d1c50: Status 404 returned error can't find the container with id 10b199a48ffae2a57a17d4307c3b50101030b5af2079555231c53b3c1e2d1c50
Apr 17 20:04:22.581775 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.581760 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 20:04:22.605965 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:22.605944 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found"
Apr 17 20:04:22.636163 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.636121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-172.ec2.internal"
event={"ID":"980e96a3ec854dc0a95f332d23ffd70e","Type":"ContainerStarted","Data":"678c278f60fe9f8427a3e6feac6b3f97d842d1255cb3e69a4cd3d9a7c6f3dd97"}
Apr 17 20:04:22.637050 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.637030 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal" event={"ID":"2d343d1c847cd1fba773386a10885afd","Type":"ContainerStarted","Data":"10b199a48ffae2a57a17d4307c3b50101030b5af2079555231c53b3c1e2d1c50"}
Apr 17 20:04:22.706233 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:22.706205 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found"
Apr 17 20:04:22.806700 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:22.806670 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found"
Apr 17 20:04:22.907335 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:22.907256 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-172.ec2.internal\" not found"
Apr 17 20:04:22.968351 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:22.968321 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:04:23.011453 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.011423 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal"
Apr 17 20:04:23.023403 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.023380 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 20:04:23.024875 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.024855 2576 kubelet.go:3340] "Creating a mirror pod for static pod"
pod="kube-system/kube-apiserver-proxy-ip-10-0-131-172.ec2.internal"
Apr 17 20:04:23.032403 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.032378 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 20:04:23.055197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.055166 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:04:23.382329 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.382249 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:04:23.491701 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.491670 2576 apiserver.go:52] "Watching apiserver"
Apr 17 20:04:23.496924 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.496896 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 20:04:23.498661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.498634 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-m6qnv","kube-system/kube-apiserver-proxy-ip-10-0-131-172.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q","openshift-cluster-node-tuning-operator/tuned-79b7g","openshift-dns/node-resolver-fd4vp","openshift-multus/multus-k5c2g","openshift-multus/network-metrics-daemon-cnnh2","openshift-ovn-kubernetes/ovnkube-node-z2wkz","kube-system/konnectivity-agent-g8vzm","openshift-image-registry/node-ca-2m9c7","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal","openshift-multus/multus-additional-cni-plugins-fwvg8","openshift-network-diagnostics/network-check-target-zgppr"]
Apr 17 20:04:23.500171 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.500147 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2"
Apr 17 20:04:23.500267 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:23.500217 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c"
Apr 17 20:04:23.502235 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.502215 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q"
Apr 17 20:04:23.502334 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.502295 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-79b7g"
Apr 17 20:04:23.504796 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.504508 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 20:04:23.504796 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.504536 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8zd92\""
Apr 17 20:04:23.504796 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.504631 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.504796 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.504671 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 20:04:23.504796 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.504732 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:04:23.505372 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.505351 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mwk7d\"" Apr 17 20:04:23.506280 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.505556 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 20:04:23.506280 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.505630 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 20:04:23.507867 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.507489 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 20:04:23.507867 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.507493 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 20:04:23.507999 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.507897 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 20:04:23.508329 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.508294 2576 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5xzvn\"" Apr 17 20:04:23.508430 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.508412 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 20:04:23.508588 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.508572 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fd4vp" Apr 17 20:04:23.508779 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.508763 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m6qnv" Apr 17 20:04:23.511093 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.511073 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 20:04:23.511209 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.511136 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 20:04:23.511312 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.511263 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:04:23.511312 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.511306 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 20:04:23.511435 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.511376 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 20:04:23.511500 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.511435 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hbspv\"" Apr 
17 20:04:23.511558 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.511517 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qrlkt\"" Apr 17 20:04:23.512458 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.512442 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.512894 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.512761 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g8vzm" Apr 17 20:04:23.513830 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.513814 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2m9c7" Apr 17 20:04:23.514463 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.514402 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 20:04:23.515825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.514812 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 20:04:23.515825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.514981 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 20:04:23.515825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.514986 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 20:04:23.515825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.515128 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 20:04:23.515825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.515366 2576 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vgm4k\"" Apr 17 20:04:23.515825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.515386 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-q6mkk\"" Apr 17 20:04:23.515825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.515407 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 20:04:23.515825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.515625 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 20:04:23.515825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.515642 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 20:04:23.516440 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.516419 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 20:04:23.516523 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.516444 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7mfpt\"" Apr 17 20:04:23.516523 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.516495 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 20:04:23.516651 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.516613 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 20:04:23.516956 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.516923 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr"
Apr 17 20:04:23.517052 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:23.516988 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c"
Apr 17 20:04:23.517052 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.517032 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.518713 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.518694 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dkw2q\""
Apr 17 20:04:23.518798 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.518756 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 20:04:23.518993 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.518974 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 20:04:23.519757 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.519735 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-device-dir\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q"
Apr 17 20:04:23.519860 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.519774 2576
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-kubernetes\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.519860 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.519799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-run\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.519860 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.519822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-log-socket\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.520008 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.519868 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d7xk\" (UniqueName: \"kubernetes.io/projected/8c7c8612-982d-4a5d-81bd-d18fab9adeef-kube-api-access-7d7xk\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.520008 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.519893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f40c767d-e93d-48a0-940e-7c8f96979338-iptables-alerter-script\") pod \"iptables-alerter-m6qnv\" (UID: \"f40c767d-e93d-48a0-940e-7c8f96979338\") " 
pod="openshift-network-operator/iptables-alerter-m6qnv" Apr 17 20:04:23.520008 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.519917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-var-lib-cni-bin\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.520008 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.519939 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.520157 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520007 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.520157 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-multus-socket-dir-parent\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.520157 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-host\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.520157 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dklvr\" (UniqueName: \"kubernetes.io/projected/76870744-6d88-4402-b9a0-678580dd9213-kube-api-access-dklvr\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.520157 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520132 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-multus-cni-dir\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.520385 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-etc-selinux\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.520385 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzmzp\" (UniqueName: \"kubernetes.io/projected/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-kube-api-access-dzmzp\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.520385 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:04:23.520254 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c7c8612-982d-4a5d-81bd-d18fab9adeef-ovn-node-metrics-cert\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.520385 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520285 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-system-cni-dir\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.520385 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-os-release\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.520385 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgm72\" (UniqueName: \"kubernetes.io/projected/959e3ae9-cf8e-4e50-b31f-76cf29653018-kube-api-access-bgm72\") pod \"node-resolver-fd4vp\" (UID: \"959e3ae9-cf8e-4e50-b31f-76cf29653018\") " pod="openshift-dns/node-resolver-fd4vp" Apr 17 20:04:23.520703 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb82g\" (UniqueName: \"kubernetes.io/projected/f40c767d-e93d-48a0-940e-7c8f96979338-kube-api-access-vb82g\") pod \"iptables-alerter-m6qnv\" (UID: 
\"f40c767d-e93d-48a0-940e-7c8f96979338\") " pod="openshift-network-operator/iptables-alerter-m6qnv" Apr 17 20:04:23.520703 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-run-netns\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.520703 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-run-k8s-cni-cncf-io\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.520703 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/951f5411-0db0-4850-b6dc-cdf6bfae4395-agent-certs\") pod \"konnectivity-agent-g8vzm\" (UID: \"951f5411-0db0-4850-b6dc-cdf6bfae4395\") " pod="kube-system/konnectivity-agent-g8vzm" Apr 17 20:04:23.520703 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520511 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-run-openvswitch\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.520703 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-run-ovn-kubernetes\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.520703 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520602 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-cni-netd\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.520703 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520650 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/959e3ae9-cf8e-4e50-b31f-76cf29653018-tmp-dir\") pod \"node-resolver-fd4vp\" (UID: \"959e3ae9-cf8e-4e50-b31f-76cf29653018\") " pod="openshift-dns/node-resolver-fd4vp" Apr 17 20:04:23.520703 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520733 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-modprobe-d\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520763 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-lib-modules\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520789 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-slash\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-sys-fs\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-sysctl-conf\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-kubelet\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520935 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-var-lib-cni-multus\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-multus-conf-dir\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-systemd-units\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.520989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-node-log\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-cnibin\") pod 
\"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-cni-binary-copy\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-run-netns\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/76870744-6d88-4402-b9a0-678580dd9213-tmp\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-cni-bin\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-run-multus-certs\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.521251 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-var-lib-kubelet\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-etc-openvswitch\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c7c8612-982d-4a5d-81bd-d18fab9adeef-ovnkube-script-lib\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521362 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-registration-dir\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 
20:04:23.521388 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-sys\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f40c767d-e93d-48a0-940e-7c8f96979338-host-slash\") pod \"iptables-alerter-m6qnv\" (UID: \"f40c767d-e93d-48a0-940e-7c8f96979338\") " pod="openshift-network-operator/iptables-alerter-m6qnv" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521456 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-multus-daemon-config\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-run-ovn\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-etc-kubernetes\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 
20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521534 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djx7r\" (UniqueName: \"kubernetes.io/projected/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-kube-api-access-djx7r\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-sysctl-d\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521579 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/76870744-6d88-4402-b9a0-678580dd9213-etc-tuned\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521599 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-run-systemd\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-hostroot\") pod \"multus-k5c2g\" (UID: 
\"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2tdn\" (UniqueName: \"kubernetes.io/projected/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-kube-api-access-g2tdn\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/951f5411-0db0-4850-b6dc-cdf6bfae4395-konnectivity-ca\") pod \"konnectivity-agent-g8vzm\" (UID: \"951f5411-0db0-4850-b6dc-cdf6bfae4395\") " pod="kube-system/konnectivity-agent-g8vzm" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c7c8612-982d-4a5d-81bd-d18fab9adeef-ovnkube-config\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.522022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c7c8612-982d-4a5d-81bd-d18fab9adeef-env-overrides\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.522780 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-socket-dir\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.522780 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-systemd\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.522780 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/959e3ae9-cf8e-4e50-b31f-76cf29653018-hosts-file\") pod \"node-resolver-fd4vp\" (UID: \"959e3ae9-cf8e-4e50-b31f-76cf29653018\") " pod="openshift-dns/node-resolver-fd4vp" Apr 17 20:04:23.522780 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-var-lib-kubelet\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.522780 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.521993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-sysconfig\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.522780 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.522019 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-var-lib-openvswitch\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.554580 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.554546 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 19:59:22 +0000 UTC" deadline="2027-11-04 04:02:40.801889488 +0000 UTC" Apr 17 20:04:23.554580 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.554579 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13567h58m17.247314187s" Apr 17 20:04:23.612470 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.612442 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 20:04:23.622481 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-system-cni-dir\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.622632 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-os-release\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.622632 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgm72\" (UniqueName: 
\"kubernetes.io/projected/959e3ae9-cf8e-4e50-b31f-76cf29653018-kube-api-access-bgm72\") pod \"node-resolver-fd4vp\" (UID: \"959e3ae9-cf8e-4e50-b31f-76cf29653018\") " pod="openshift-dns/node-resolver-fd4vp" Apr 17 20:04:23.622632 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb82g\" (UniqueName: \"kubernetes.io/projected/f40c767d-e93d-48a0-940e-7c8f96979338-kube-api-access-vb82g\") pod \"iptables-alerter-m6qnv\" (UID: \"f40c767d-e93d-48a0-940e-7c8f96979338\") " pod="openshift-network-operator/iptables-alerter-m6qnv" Apr 17 20:04:23.622632 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-run-netns\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.622632 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-os-release\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-run-k8s-cni-cncf-io\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-system-cni-dir\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-run-netns\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-run-k8s-cni-cncf-io\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/951f5411-0db0-4850-b6dc-cdf6bfae4395-agent-certs\") pod \"konnectivity-agent-g8vzm\" (UID: \"951f5411-0db0-4850-b6dc-cdf6bfae4395\") " pod="kube-system/konnectivity-agent-g8vzm" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-run-openvswitch\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-run-ovn-kubernetes\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-cni-netd\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-run-openvswitch\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/959e3ae9-cf8e-4e50-b31f-76cf29653018-tmp-dir\") pod \"node-resolver-fd4vp\" (UID: \"959e3ae9-cf8e-4e50-b31f-76cf29653018\") " pod="openshift-dns/node-resolver-fd4vp" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-run-ovn-kubernetes\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-cni-netd\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-modprobe-d\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.622899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-lib-modules\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-slash\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:23.622921 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-sys-fs\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.622957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-sysctl-conf\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:23.622974 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs podName:7e87cc22-ed21-46b2-9dac-b1a3906efc5c nodeName:}" failed. No retries permitted until 2026-04-17 20:04:24.122952707 +0000 UTC m=+2.990772463 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs") pod "network-metrics-daemon-cnnh2" (UID: "7e87cc22-ed21-46b2-9dac-b1a3906efc5c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-kubelet\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623007 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-slash\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-var-lib-cni-multus\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-multus-conf-dir\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623056 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-lib-modules\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-modprobe-d\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623077 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-cni-binary-copy\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-sysctl-conf\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6csmb\" (UniqueName: \"kubernetes.io/projected/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-kube-api-access-6csmb\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8" Apr 17 
20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623110 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-sys-fs\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.623539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/959e3ae9-cf8e-4e50-b31f-76cf29653018-tmp-dir\") pod \"node-resolver-fd4vp\" (UID: \"959e3ae9-cf8e-4e50-b31f-76cf29653018\") " pod="openshift-dns/node-resolver-fd4vp" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-systemd-units\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-var-lib-cni-multus\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-systemd-units\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-node-log\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-kubelet\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-multus-conf-dir\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-node-log\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-cnibin\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-cni-binary-copy\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-run-netns\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-cnibin\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8p7b\" (UniqueName: \"kubernetes.io/projected/eb52d0ba-f85e-4263-9b90-db812ebeca7c-kube-api-access-s8p7b\") pod \"node-ca-2m9c7\" (UID: \"eb52d0ba-f85e-4263-9b90-db812ebeca7c\") " pod="openshift-image-registry/node-ca-2m9c7" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-cnibin\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-run-netns\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/76870744-6d88-4402-b9a0-678580dd9213-tmp\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-cni-bin\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-run-multus-certs\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.624234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623455 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-cni-bin\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-var-lib-kubelet\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-etc-openvswitch\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623519 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-run-multus-certs\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c7c8612-982d-4a5d-81bd-d18fab9adeef-ovnkube-script-lib\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb52d0ba-f85e-4263-9b90-db812ebeca7c-host\") pod \"node-ca-2m9c7\" (UID: \"eb52d0ba-f85e-4263-9b90-db812ebeca7c\") " pod="openshift-image-registry/node-ca-2m9c7" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-var-lib-kubelet\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-registration-dir\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-sys\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-etc-openvswitch\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f40c767d-e93d-48a0-940e-7c8f96979338-host-slash\") pod \"iptables-alerter-m6qnv\" (UID: \"f40c767d-e93d-48a0-940e-7c8f96979338\") " pod="openshift-network-operator/iptables-alerter-m6qnv" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-registration-dir\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-multus-daemon-config\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-run-ovn\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623729 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-sys\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-etc-kubernetes\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.625020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f40c767d-e93d-48a0-940e-7c8f96979338-host-slash\") pod \"iptables-alerter-m6qnv\" (UID: \"f40c767d-e93d-48a0-940e-7c8f96979338\") " pod="openshift-network-operator/iptables-alerter-m6qnv" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-run-ovn\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623816 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-djx7r\" (UniqueName: \"kubernetes.io/projected/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-kube-api-access-djx7r\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-sysctl-d\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/76870744-6d88-4402-b9a0-678580dd9213-etc-tuned\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623894 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-cni-binary-copy\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623902 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-etc-kubernetes\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623912 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-run-systemd\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-hostroot\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2tdn\" (UniqueName: \"kubernetes.io/projected/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-kube-api-access-g2tdn\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/951f5411-0db0-4850-b6dc-cdf6bfae4395-konnectivity-ca\") pod \"konnectivity-agent-g8vzm\" (UID: \"951f5411-0db0-4850-b6dc-cdf6bfae4395\") " pod="kube-system/konnectivity-agent-g8vzm" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623993 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-run-systemd\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/8c7c8612-982d-4a5d-81bd-d18fab9adeef-ovnkube-config\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.623995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-sysctl-d\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-multus-daemon-config\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c7c8612-982d-4a5d-81bd-d18fab9adeef-ovnkube-script-lib\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624357 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-hostroot\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.625750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/8c7c8612-982d-4a5d-81bd-d18fab9adeef-env-overrides\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624495 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-socket-dir\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-systemd\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/959e3ae9-cf8e-4e50-b31f-76cf29653018-hosts-file\") pod \"node-resolver-fd4vp\" (UID: \"959e3ae9-cf8e-4e50-b31f-76cf29653018\") " pod="openshift-dns/node-resolver-fd4vp" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-var-lib-kubelet\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624610 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624642 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whvrs\" (UniqueName: \"kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs\") pod \"network-check-target-zgppr\" (UID: \"8ea69030-bfbd-46e0-bba4-3408bc57cb1c\") " pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-sysconfig\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624684 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/959e3ae9-cf8e-4e50-b31f-76cf29653018-hosts-file\") pod \"node-resolver-fd4vp\" (UID: \"959e3ae9-cf8e-4e50-b31f-76cf29653018\") " pod="openshift-dns/node-resolver-fd4vp" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624699 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-var-lib-openvswitch\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.626564 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:04:23.624716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-socket-dir\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624728 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-os-release\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-systemd\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-device-dir\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624786 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-kubernetes\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " 
pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-var-lib-kubelet\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-run\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.626564 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-log-socket\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624878 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-kubernetes\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624876 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-device-dir\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7d7xk\" (UniqueName: \"kubernetes.io/projected/8c7c8612-982d-4a5d-81bd-d18fab9adeef-kube-api-access-7d7xk\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-run\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f40c767d-e93d-48a0-940e-7c8f96979338-iptables-alerter-script\") pod \"iptables-alerter-m6qnv\" (UID: \"f40c767d-e93d-48a0-940e-7c8f96979338\") " pod="openshift-network-operator/iptables-alerter-m6qnv" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624952 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c7c8612-982d-4a5d-81bd-d18fab9adeef-ovnkube-config\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-var-lib-cni-bin\") pod \"multus-k5c2g\" (UID: 
\"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-var-lib-openvswitch\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.624994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-system-cni-dir\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-log-socket\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625031 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-etc-sysconfig\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-multus-socket-dir-parent\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625117 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-host-var-lib-cni-bin\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.627333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-host\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-multus-socket-dir-parent\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dklvr\" (UniqueName: \"kubernetes.io/projected/76870744-6d88-4402-b9a0-678580dd9213-kube-api-access-dklvr\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-multus-cni-dir\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g" Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb52d0ba-f85e-4263-9b90-db812ebeca7c-serviceca\") pod \"node-ca-2m9c7\" (UID: \"eb52d0ba-f85e-4263-9b90-db812ebeca7c\") " pod="openshift-image-registry/node-ca-2m9c7" Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/8c7c8612-982d-4a5d-81bd-d18fab9adeef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-etc-selinux\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76870744-6d88-4402-b9a0-678580dd9213-host\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g" Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzmzp\" (UniqueName: \"kubernetes.io/projected/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-kube-api-access-dzmzp\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c7c8612-982d-4a5d-81bd-d18fab9adeef-ovn-node-metrics-cert\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625430 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c7c8612-982d-4a5d-81bd-d18fab9adeef-env-overrides\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz"
Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625497 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-multus-cni-dir\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g"
Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-etc-selinux\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q"
Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625593 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f40c767d-e93d-48a0-940e-7c8f96979338-iptables-alerter-script\") pod \"iptables-alerter-m6qnv\" (UID: \"f40c767d-e93d-48a0-940e-7c8f96979338\") " pod="openshift-network-operator/iptables-alerter-m6qnv"
Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.625704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/951f5411-0db0-4850-b6dc-cdf6bfae4395-konnectivity-ca\") pod \"konnectivity-agent-g8vzm\" (UID: \"951f5411-0db0-4850-b6dc-cdf6bfae4395\") " pod="kube-system/konnectivity-agent-g8vzm"
Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.627137
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/951f5411-0db0-4850-b6dc-cdf6bfae4395-agent-certs\") pod \"konnectivity-agent-g8vzm\" (UID: \"951f5411-0db0-4850-b6dc-cdf6bfae4395\") " pod="kube-system/konnectivity-agent-g8vzm"
Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.627373 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/76870744-6d88-4402-b9a0-678580dd9213-tmp\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g"
Apr 17 20:04:23.627986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.627956 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c7c8612-982d-4a5d-81bd-d18fab9adeef-ovn-node-metrics-cert\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz"
Apr 17 20:04:23.628745 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.628112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/76870744-6d88-4402-b9a0-678580dd9213-etc-tuned\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g"
Apr 17 20:04:23.636244 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.635542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgm72\" (UniqueName: \"kubernetes.io/projected/959e3ae9-cf8e-4e50-b31f-76cf29653018-kube-api-access-bgm72\") pod \"node-resolver-fd4vp\" (UID: \"959e3ae9-cf8e-4e50-b31f-76cf29653018\") " pod="openshift-dns/node-resolver-fd4vp"
Apr 17 20:04:23.636244 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.635879 2576 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"kube-api-access-vb82g\" (UniqueName: \"kubernetes.io/projected/f40c767d-e93d-48a0-940e-7c8f96979338-kube-api-access-vb82g\") pod \"iptables-alerter-m6qnv\" (UID: \"f40c767d-e93d-48a0-940e-7c8f96979338\") " pod="openshift-network-operator/iptables-alerter-m6qnv"
Apr 17 20:04:23.638800 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.638778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzmzp\" (UniqueName: \"kubernetes.io/projected/9b02a788-0b0e-471a-9bdb-94dc0228f8a3-kube-api-access-dzmzp\") pod \"aws-ebs-csi-driver-node-qnm9q\" (UID: \"9b02a788-0b0e-471a-9bdb-94dc0228f8a3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q"
Apr 17 20:04:23.639550 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.639450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d7xk\" (UniqueName: \"kubernetes.io/projected/8c7c8612-982d-4a5d-81bd-d18fab9adeef-kube-api-access-7d7xk\") pod \"ovnkube-node-z2wkz\" (UID: \"8c7c8612-982d-4a5d-81bd-d18fab9adeef\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz"
Apr 17 20:04:23.639550 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.639494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2tdn\" (UniqueName: \"kubernetes.io/projected/a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b-kube-api-access-g2tdn\") pod \"multus-k5c2g\" (UID: \"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b\") " pod="openshift-multus/multus-k5c2g"
Apr 17 20:04:23.640202 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.640184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dklvr\" (UniqueName: \"kubernetes.io/projected/76870744-6d88-4402-b9a0-678580dd9213-kube-api-access-dklvr\") pod \"tuned-79b7g\" (UID: \"76870744-6d88-4402-b9a0-678580dd9213\") " pod="openshift-cluster-node-tuning-operator/tuned-79b7g"
Apr 17 20:04:23.640826 ip-10-0-131-172
kubenswrapper[2576]: I0417 20:04:23.640808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djx7r\" (UniqueName: \"kubernetes.io/projected/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-kube-api-access-djx7r\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2"
Apr 17 20:04:23.726587 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.726555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb52d0ba-f85e-4263-9b90-db812ebeca7c-serviceca\") pod \"node-ca-2m9c7\" (UID: \"eb52d0ba-f85e-4263-9b90-db812ebeca7c\") " pod="openshift-image-registry/node-ca-2m9c7"
Apr 17 20:04:23.726760 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.726607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-cni-binary-copy\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.726760 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.726632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6csmb\" (UniqueName: \"kubernetes.io/projected/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-kube-api-access-6csmb\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.726760 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.726739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8p7b\" (UniqueName: \"kubernetes.io/projected/eb52d0ba-f85e-4263-9b90-db812ebeca7c-kube-api-access-s8p7b\") pod \"node-ca-2m9c7\" (UID: \"eb52d0ba-f85e-4263-9b90-db812ebeca7c\")
" pod="openshift-image-registry/node-ca-2m9c7"
Apr 17 20:04:23.726952 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.726766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-cnibin\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.726952 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.726794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.726952 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.726822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb52d0ba-f85e-4263-9b90-db812ebeca7c-host\") pod \"node-ca-2m9c7\" (UID: \"eb52d0ba-f85e-4263-9b90-db812ebeca7c\") " pod="openshift-image-registry/node-ca-2m9c7"
Apr 17 20:04:23.726952 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.726867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.726952 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.726906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName:
\"kubernetes.io/configmap/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.726952 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.726938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-cnibin\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.727287 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.726933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whvrs\" (UniqueName: \"kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs\") pod \"network-check-target-zgppr\" (UID: \"8ea69030-bfbd-46e0-bba4-3408bc57cb1c\") " pod="openshift-network-diagnostics/network-check-target-zgppr"
Apr 17 20:04:23.727287 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.726964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb52d0ba-f85e-4263-9b90-db812ebeca7c-host\") pod \"node-ca-2m9c7\" (UID: \"eb52d0ba-f85e-4263-9b90-db812ebeca7c\") " pod="openshift-image-registry/node-ca-2m9c7"
Apr 17 20:04:23.727287 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.727013 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-os-release\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.727287 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.727037 2576 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.727287 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.727052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-system-cni-dir\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.727287 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.727119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-os-release\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.727287 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.727218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-system-cni-dir\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.727287 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.727215 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-cni-binary-copy\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8" Apr
17 20:04:23.727577 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.727341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb52d0ba-f85e-4263-9b90-db812ebeca7c-serviceca\") pod \"node-ca-2m9c7\" (UID: \"eb52d0ba-f85e-4263-9b90-db812ebeca7c\") " pod="openshift-image-registry/node-ca-2m9c7"
Apr 17 20:04:23.727577 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.727456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.727577 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.727500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.732612 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:23.732591 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:04:23.732612 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:23.732613 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:04:23.732737 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:23.732623 2576 projected.go:194] Error preparing data for projected volume kube-api-access-whvrs for pod
openshift-network-diagnostics/network-check-target-zgppr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:04:23.732737 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:23.732675 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs podName:8ea69030-bfbd-46e0-bba4-3408bc57cb1c nodeName:}" failed. No retries permitted until 2026-04-17 20:04:24.232660327 +0000 UTC m=+3.100480066 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-whvrs" (UniqueName: "kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs") pod "network-check-target-zgppr" (UID: "8ea69030-bfbd-46e0-bba4-3408bc57cb1c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:04:23.734774 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.734749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6csmb\" (UniqueName: \"kubernetes.io/projected/7221eba4-90c3-41d0-809f-2cfbbcbf40d8-kube-api-access-6csmb\") pod \"multus-additional-cni-plugins-fwvg8\" (UID: \"7221eba4-90c3-41d0-809f-2cfbbcbf40d8\") " pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.734878 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.734790 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8p7b\" (UniqueName: \"kubernetes.io/projected/eb52d0ba-f85e-4263-9b90-db812ebeca7c-kube-api-access-s8p7b\") pod \"node-ca-2m9c7\" (UID: \"eb52d0ba-f85e-4263-9b90-db812ebeca7c\") " pod="openshift-image-registry/node-ca-2m9c7"
Apr 17 20:04:23.817917 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.817882 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q"
Apr 17 20:04:23.823578 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.823557 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-79b7g"
Apr 17 20:04:23.832260 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.832236 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-k5c2g"
Apr 17 20:04:23.839810 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.839788 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fd4vp"
Apr 17 20:04:23.848397 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.848376 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m6qnv"
Apr 17 20:04:23.855448 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.855427 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz"
Apr 17 20:04:23.863131 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.863109 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g8vzm"
Apr 17 20:04:23.871682 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.871663 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2m9c7"
Apr 17 20:04:23.877176 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.877158 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fwvg8"
Apr 17 20:04:23.910961 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:23.910698 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:04:24.130242 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.130215 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2"
Apr 17 20:04:24.130418 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:24.130341 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:04:24.130418 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:24.130397 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs podName:7e87cc22-ed21-46b2-9dac-b1a3906efc5c nodeName:}" failed. No retries permitted until 2026-04-17 20:04:25.130381157 +0000 UTC m=+3.998200913 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs") pod "network-metrics-daemon-cnnh2" (UID: "7e87cc22-ed21-46b2-9dac-b1a3906efc5c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:04:24.269119 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:24.269081 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4c3ac1f_f0bf_4ff3_9fef_fb513199e95b.slice/crio-5b82ff0f358cd427e78d40ec1776ebe2ae0dd1c18bdfefa75a1c3cdb3976d587 WatchSource:0}: Error finding container 5b82ff0f358cd427e78d40ec1776ebe2ae0dd1c18bdfefa75a1c3cdb3976d587: Status 404 returned error can't find the container with id 5b82ff0f358cd427e78d40ec1776ebe2ae0dd1c18bdfefa75a1c3cdb3976d587
Apr 17 20:04:24.269878 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:24.269856 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b02a788_0b0e_471a_9bdb_94dc0228f8a3.slice/crio-550c21ea198bc287cf311b53a786c82c6cdbc4c43c0d86e290fed34df170a451 WatchSource:0}: Error finding container 550c21ea198bc287cf311b53a786c82c6cdbc4c43c0d86e290fed34df170a451: Status 404 returned error can't find the container with id 550c21ea198bc287cf311b53a786c82c6cdbc4c43c0d86e290fed34df170a451
Apr 17 20:04:24.270797 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:24.270775 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c7c8612_982d_4a5d_81bd_d18fab9adeef.slice/crio-ccc81cc5914af4f79b1c699688ea663405cc15b842106f55fe9d01244cc13d4e WatchSource:0}: Error finding container ccc81cc5914af4f79b1c699688ea663405cc15b842106f55fe9d01244cc13d4e: Status 404 returned error can't find the container with id ccc81cc5914af4f79b1c699688ea663405cc15b842106f55fe9d01244cc13d4e
Apr 17 20:04:24.272067
ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:24.272045 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76870744_6d88_4402_b9a0_678580dd9213.slice/crio-f34597dbdb7bea89da1e0e6491b96711cb3c1af287a169c5b4731ae0fb9375df WatchSource:0}: Error finding container f34597dbdb7bea89da1e0e6491b96711cb3c1af287a169c5b4731ae0fb9375df: Status 404 returned error can't find the container with id f34597dbdb7bea89da1e0e6491b96711cb3c1af287a169c5b4731ae0fb9375df
Apr 17 20:04:24.274869 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:24.274534 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf40c767d_e93d_48a0_940e_7c8f96979338.slice/crio-1711cf4db0fe2df1c180b3f3bfa11afbc39c716d76d5aaf09482d47b55c930e4 WatchSource:0}: Error finding container 1711cf4db0fe2df1c180b3f3bfa11afbc39c716d76d5aaf09482d47b55c930e4: Status 404 returned error can't find the container with id 1711cf4db0fe2df1c180b3f3bfa11afbc39c716d76d5aaf09482d47b55c930e4
Apr 17 20:04:24.275546 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:24.275512 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7221eba4_90c3_41d0_809f_2cfbbcbf40d8.slice/crio-1d8a4e9a6d9d44782e1ec9d1accdf58ae2ddccedd0e02ce2d925f4ab44b3a086 WatchSource:0}: Error finding container 1d8a4e9a6d9d44782e1ec9d1accdf58ae2ddccedd0e02ce2d925f4ab44b3a086: Status 404 returned error can't find the container with id 1d8a4e9a6d9d44782e1ec9d1accdf58ae2ddccedd0e02ce2d925f4ab44b3a086
Apr 17 20:04:24.280244 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:24.280220 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod959e3ae9_cf8e_4e50_b31f_76cf29653018.slice/crio-8ddbfa9d876258fabb6439b53c482fddf1138927baa722bdb701a6405f62c96a WatchSource:0}:
Error finding container 8ddbfa9d876258fabb6439b53c482fddf1138927baa722bdb701a6405f62c96a: Status 404 returned error can't find the container with id 8ddbfa9d876258fabb6439b53c482fddf1138927baa722bdb701a6405f62c96a
Apr 17 20:04:24.281693 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:24.281669 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb52d0ba_f85e_4263_9b90_db812ebeca7c.slice/crio-1ed3c4a89b20f8e91c838d3302f9e369179c1b3299b930630dd2e52668b0d695 WatchSource:0}: Error finding container 1ed3c4a89b20f8e91c838d3302f9e369179c1b3299b930630dd2e52668b0d695: Status 404 returned error can't find the container with id 1ed3c4a89b20f8e91c838d3302f9e369179c1b3299b930630dd2e52668b0d695
Apr 17 20:04:24.331764 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.331632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whvrs\" (UniqueName: \"kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs\") pod \"network-check-target-zgppr\" (UID: \"8ea69030-bfbd-46e0-bba4-3408bc57cb1c\") " pod="openshift-network-diagnostics/network-check-target-zgppr"
Apr 17 20:04:24.331859 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:24.331787 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:04:24.331859 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:24.331810 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:04:24.331859 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:24.331822 2576 projected.go:194] Error preparing data for projected volume kube-api-access-whvrs for pod openshift-network-diagnostics/network-check-target-zgppr: [object
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:04:24.331961 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:24.331886 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs podName:8ea69030-bfbd-46e0-bba4-3408bc57cb1c nodeName:}" failed. No retries permitted until 2026-04-17 20:04:25.331869688 +0000 UTC m=+4.199689427 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-whvrs" (UniqueName: "kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs") pod "network-check-target-zgppr" (UID: "8ea69030-bfbd-46e0-bba4-3408bc57cb1c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:04:24.555316 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.555153 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 19:59:22 +0000 UTC" deadline="2027-12-17 18:58:19.645480199 +0000 UTC"
Apr 17 20:04:24.555316 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.555183 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14614h53m55.09030019s"
Apr 17 20:04:24.634406 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.634377 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2"
Apr 17 20:04:24.634558 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:24.634512 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c"
Apr 17 20:04:24.646868 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.646325 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-172.ec2.internal" event={"ID":"980e96a3ec854dc0a95f332d23ffd70e","Type":"ContainerStarted","Data":"21a423f64c326e541c26c3759c434bd4341a065216d25a207c9861b75cd94947"}
Apr 17 20:04:24.650344 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.650196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g8vzm" event={"ID":"951f5411-0db0-4850-b6dc-cdf6bfae4395","Type":"ContainerStarted","Data":"e78b330b2c823f3d3b8f99d3ad61226d37a7ac43db876c9c9c5df14c32f832a3"}
Apr 17 20:04:24.656191 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.655755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-79b7g" event={"ID":"76870744-6d88-4402-b9a0-678580dd9213","Type":"ContainerStarted","Data":"f34597dbdb7bea89da1e0e6491b96711cb3c1af287a169c5b4731ae0fb9375df"}
Apr 17 20:04:24.658102 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.658075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fwvg8" event={"ID":"7221eba4-90c3-41d0-809f-2cfbbcbf40d8","Type":"ContainerStarted","Data":"1d8a4e9a6d9d44782e1ec9d1accdf58ae2ddccedd0e02ce2d925f4ab44b3a086"}
Apr 17 20:04:24.662783 ip-10-0-131-172 kubenswrapper[2576]:
I0417 20:04:24.662733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fd4vp" event={"ID":"959e3ae9-cf8e-4e50-b31f-76cf29653018","Type":"ContainerStarted","Data":"8ddbfa9d876258fabb6439b53c482fddf1138927baa722bdb701a6405f62c96a"}
Apr 17 20:04:24.666581 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.666557 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2m9c7" event={"ID":"eb52d0ba-f85e-4263-9b90-db812ebeca7c","Type":"ContainerStarted","Data":"1ed3c4a89b20f8e91c838d3302f9e369179c1b3299b930630dd2e52668b0d695"}
Apr 17 20:04:24.669672 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.669647 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m6qnv" event={"ID":"f40c767d-e93d-48a0-940e-7c8f96979338","Type":"ContainerStarted","Data":"1711cf4db0fe2df1c180b3f3bfa11afbc39c716d76d5aaf09482d47b55c930e4"}
Apr 17 20:04:24.672881 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.672856 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" event={"ID":"8c7c8612-982d-4a5d-81bd-d18fab9adeef","Type":"ContainerStarted","Data":"ccc81cc5914af4f79b1c699688ea663405cc15b842106f55fe9d01244cc13d4e"}
Apr 17 20:04:24.675603 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.675578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" event={"ID":"9b02a788-0b0e-471a-9bdb-94dc0228f8a3","Type":"ContainerStarted","Data":"550c21ea198bc287cf311b53a786c82c6cdbc4c43c0d86e290fed34df170a451"}
Apr 17 20:04:24.676995 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:24.676969 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k5c2g" event={"ID":"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b","Type":"ContainerStarted","Data":"5b82ff0f358cd427e78d40ec1776ebe2ae0dd1c18bdfefa75a1c3cdb3976d587"}
Apr 17 20:04:25.138398
ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:25.138362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:25.138565 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:25.138515 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:25.138626 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:25.138577 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs podName:7e87cc22-ed21-46b2-9dac-b1a3906efc5c nodeName:}" failed. No retries permitted until 2026-04-17 20:04:27.138558324 +0000 UTC m=+6.006378069 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs") pod "network-metrics-daemon-cnnh2" (UID: "7e87cc22-ed21-46b2-9dac-b1a3906efc5c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:25.340985 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:25.340303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whvrs\" (UniqueName: \"kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs\") pod \"network-check-target-zgppr\" (UID: \"8ea69030-bfbd-46e0-bba4-3408bc57cb1c\") " pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:25.340985 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:25.340456 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:04:25.340985 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:25.340484 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:04:25.340985 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:25.340497 2576 projected.go:194] Error preparing data for projected volume kube-api-access-whvrs for pod openshift-network-diagnostics/network-check-target-zgppr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:25.340985 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:25.340557 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs podName:8ea69030-bfbd-46e0-bba4-3408bc57cb1c nodeName:}" failed. 
No retries permitted until 2026-04-17 20:04:27.340538124 +0000 UTC m=+6.208357868 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-whvrs" (UniqueName: "kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs") pod "network-check-target-zgppr" (UID: "8ea69030-bfbd-46e0-bba4-3408bc57cb1c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:25.634093 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:25.634016 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:25.634521 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:25.634153 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c" Apr 17 20:04:25.687862 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:25.687248 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d343d1c847cd1fba773386a10885afd" containerID="1c49d78657bd962dde9b75547956298ab47221769d23149d8f0ab40ef571fbea" exitCode=0 Apr 17 20:04:25.687862 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:25.687777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal" event={"ID":"2d343d1c847cd1fba773386a10885afd","Type":"ContainerDied","Data":"1c49d78657bd962dde9b75547956298ab47221769d23149d8f0ab40ef571fbea"} Apr 17 20:04:25.702201 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:25.702154 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-172.ec2.internal" podStartSLOduration=2.702137697 podStartE2EDuration="2.702137697s" podCreationTimestamp="2026-04-17 20:04:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:04:24.660638764 +0000 UTC m=+3.528458527" watchObservedRunningTime="2026-04-17 20:04:25.702137697 +0000 UTC m=+4.569957460" Apr 17 20:04:26.634480 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:26.633988 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:26.634480 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:26.634129 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c" Apr 17 20:04:26.699626 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:26.699575 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal" event={"ID":"2d343d1c847cd1fba773386a10885afd","Type":"ContainerStarted","Data":"b98d33b9b19b342aaa9d04e4a146282590d75cddb1e7b1e291a3a8e8288b1b5c"} Apr 17 20:04:27.158074 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:27.158034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:27.158263 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:27.158204 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:27.158336 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:27.158284 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs podName:7e87cc22-ed21-46b2-9dac-b1a3906efc5c nodeName:}" failed. No retries permitted until 2026-04-17 20:04:31.158263161 +0000 UTC m=+10.026082916 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs") pod "network-metrics-daemon-cnnh2" (UID: "7e87cc22-ed21-46b2-9dac-b1a3906efc5c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:27.359698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:27.359656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whvrs\" (UniqueName: \"kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs\") pod \"network-check-target-zgppr\" (UID: \"8ea69030-bfbd-46e0-bba4-3408bc57cb1c\") " pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:27.359894 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:27.359857 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:04:27.359894 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:27.359880 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:04:27.360024 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:27.359895 2576 projected.go:194] Error preparing data for projected volume kube-api-access-whvrs for pod openshift-network-diagnostics/network-check-target-zgppr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:27.360024 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:27.359965 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs podName:8ea69030-bfbd-46e0-bba4-3408bc57cb1c nodeName:}" failed. 
No retries permitted until 2026-04-17 20:04:31.359944302 +0000 UTC m=+10.227764058 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-whvrs" (UniqueName: "kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs") pod "network-check-target-zgppr" (UID: "8ea69030-bfbd-46e0-bba4-3408bc57cb1c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:27.636536 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:27.636441 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:27.637036 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:27.636567 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c" Apr 17 20:04:28.633859 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:28.633816 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:28.634057 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:28.633995 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c" Apr 17 20:04:29.634343 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:29.633773 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:29.634343 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:29.633926 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c" Apr 17 20:04:30.634598 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:30.634067 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:30.634598 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:30.634216 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c" Apr 17 20:04:31.193314 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:31.193243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:31.193502 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:31.193398 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:31.193502 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:31.193467 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs podName:7e87cc22-ed21-46b2-9dac-b1a3906efc5c nodeName:}" failed. No retries permitted until 2026-04-17 20:04:39.193448047 +0000 UTC m=+18.061267808 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs") pod "network-metrics-daemon-cnnh2" (UID: "7e87cc22-ed21-46b2-9dac-b1a3906efc5c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:31.394734 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:31.394670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whvrs\" (UniqueName: \"kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs\") pod \"network-check-target-zgppr\" (UID: \"8ea69030-bfbd-46e0-bba4-3408bc57cb1c\") " pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:31.394958 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:31.394820 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:04:31.394958 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:31.394863 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:04:31.394958 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:31.394878 2576 projected.go:194] Error preparing data for projected volume kube-api-access-whvrs for pod openshift-network-diagnostics/network-check-target-zgppr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:31.394958 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:31.394946 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs podName:8ea69030-bfbd-46e0-bba4-3408bc57cb1c nodeName:}" failed. 
No retries permitted until 2026-04-17 20:04:39.394925523 +0000 UTC m=+18.262745267 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-whvrs" (UniqueName: "kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs") pod "network-check-target-zgppr" (UID: "8ea69030-bfbd-46e0-bba4-3408bc57cb1c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:31.634390 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:31.634357 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:31.634575 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:31.634468 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c" Apr 17 20:04:32.633734 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:32.633695 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:32.634238 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:32.633865 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c" Apr 17 20:04:33.634562 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:33.634476 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:33.635098 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:33.634591 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c" Apr 17 20:04:34.634345 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:34.634312 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:34.634534 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:34.634448 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c" Apr 17 20:04:35.634236 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:35.634201 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:35.634693 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:35.634305 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c" Apr 17 20:04:36.633547 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:36.633527 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:36.633650 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:36.633633 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c" Apr 17 20:04:37.634025 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:37.633984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:37.634477 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:37.634119 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c" Apr 17 20:04:38.633457 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:38.633386 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:38.633670 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:38.633541 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c" Apr 17 20:04:39.252580 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:39.252537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:39.253012 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:39.252710 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:39.253012 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:39.252792 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs podName:7e87cc22-ed21-46b2-9dac-b1a3906efc5c nodeName:}" failed. No retries permitted until 2026-04-17 20:04:55.252769789 +0000 UTC m=+34.120589531 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs") pod "network-metrics-daemon-cnnh2" (UID: "7e87cc22-ed21-46b2-9dac-b1a3906efc5c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:39.453752 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:39.453719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whvrs\" (UniqueName: \"kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs\") pod \"network-check-target-zgppr\" (UID: \"8ea69030-bfbd-46e0-bba4-3408bc57cb1c\") " pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:39.453930 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:39.453887 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:04:39.453930 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:39.453907 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:04:39.453930 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:39.453918 2576 projected.go:194] Error preparing data for projected volume kube-api-access-whvrs for pod openshift-network-diagnostics/network-check-target-zgppr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:39.454051 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:39.453967 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs podName:8ea69030-bfbd-46e0-bba4-3408bc57cb1c nodeName:}" failed. 
No retries permitted until 2026-04-17 20:04:55.453952978 +0000 UTC m=+34.321772717 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-whvrs" (UniqueName: "kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs") pod "network-check-target-zgppr" (UID: "8ea69030-bfbd-46e0-bba4-3408bc57cb1c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:39.634029 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:39.633984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:39.634219 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:39.634107 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c" Apr 17 20:04:40.633697 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:40.633663 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:40.634174 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:40.633795 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c" Apr 17 20:04:41.634872 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.634702 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:41.635372 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:41.634932 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c" Apr 17 20:04:41.727162 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.727134 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2m9c7" event={"ID":"eb52d0ba-f85e-4263-9b90-db812ebeca7c","Type":"ContainerStarted","Data":"685d7c18f9f98da2480ffdd736655bad4abc2fb2afdd2a781dcaa9771b54d40d"} Apr 17 20:04:41.728643 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.728625 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/0.log" Apr 17 20:04:41.728941 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.728921 2576 generic.go:358] "Generic (PLEG): container finished" podID="8c7c8612-982d-4a5d-81bd-d18fab9adeef" containerID="482a791ed515f1a8e11ac1a47fed5b08eeebdfbe3c800e09e043805f3bcadf24" exitCode=1 Apr 17 20:04:41.728996 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.728981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" 
event={"ID":"8c7c8612-982d-4a5d-81bd-d18fab9adeef","Type":"ContainerStarted","Data":"a1b1e27afb1f670c4fd56abdcea27ac81f37e2e08d940098cbe5c7905ddcee6d"}
Apr 17 20:04:41.729041 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.729002 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" event={"ID":"8c7c8612-982d-4a5d-81bd-d18fab9adeef","Type":"ContainerDied","Data":"482a791ed515f1a8e11ac1a47fed5b08eeebdfbe3c800e09e043805f3bcadf24"}
Apr 17 20:04:41.729041 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.729018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" event={"ID":"8c7c8612-982d-4a5d-81bd-d18fab9adeef","Type":"ContainerStarted","Data":"44b3d87c6e057548d416bc6c5a90f77e6d9524afc7ce46f392979ccbecae57ae"}
Apr 17 20:04:41.730116 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.730092 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" event={"ID":"9b02a788-0b0e-471a-9bdb-94dc0228f8a3","Type":"ContainerStarted","Data":"72513c2f833902a57839e64f6c189bb799c4325aa34e3802ee5a5f756841ac39"}
Apr 17 20:04:41.731305 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.731286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k5c2g" event={"ID":"a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b","Type":"ContainerStarted","Data":"e93234ab826eb685c5c83135be21add058d8fcd24ddba515da1c4b95886613c6"}
Apr 17 20:04:41.732441 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.732421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g8vzm" event={"ID":"951f5411-0db0-4850-b6dc-cdf6bfae4395","Type":"ContainerStarted","Data":"37c9cd133c3285240a3be17f5a3fefb27627fe6b31797d78190476657ab52f92"}
Apr 17 20:04:41.733511 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.733491 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-79b7g" event={"ID":"76870744-6d88-4402-b9a0-678580dd9213","Type":"ContainerStarted","Data":"684bb1953296dbbd484f72cbb7a38bcda7b1aeee966695171c6350afdd03b7b2"}
Apr 17 20:04:41.734548 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.734527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fwvg8" event={"ID":"7221eba4-90c3-41d0-809f-2cfbbcbf40d8","Type":"ContainerStarted","Data":"eeb0fbd9b5c229ca9b2818163f908daaf06e7e691ab78f2fe1ad3e35c4de5502"}
Apr 17 20:04:41.735885 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.735868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fd4vp" event={"ID":"959e3ae9-cf8e-4e50-b31f-76cf29653018","Type":"ContainerStarted","Data":"bfc1e61d39aaaa70284005e37703566dd40af7ff09b7601b182a9b73250aaae1"}
Apr 17 20:04:41.756209 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.756162 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-172.ec2.internal" podStartSLOduration=18.756146709 podStartE2EDuration="18.756146709s" podCreationTimestamp="2026-04-17 20:04:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:04:26.714856737 +0000 UTC m=+5.582676498" watchObservedRunningTime="2026-04-17 20:04:41.756146709 +0000 UTC m=+20.623966471"
Apr 17 20:04:41.770208 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.770168 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-79b7g" podStartSLOduration=3.774462915 podStartE2EDuration="20.770154903s" podCreationTimestamp="2026-04-17 20:04:21 +0000 UTC" firstStartedPulling="2026-04-17 20:04:24.274240175 +0000 UTC m=+3.142059913" lastFinishedPulling="2026-04-17 20:04:41.269932157 +0000 UTC m=+20.137751901" observedRunningTime="2026-04-17 20:04:41.769373147 +0000 UTC m=+20.637192907" watchObservedRunningTime="2026-04-17 20:04:41.770154903 +0000 UTC m=+20.637974663"
Apr 17 20:04:41.782634 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.782554 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fd4vp" podStartSLOduration=3.795034354 podStartE2EDuration="20.782539801s" podCreationTimestamp="2026-04-17 20:04:21 +0000 UTC" firstStartedPulling="2026-04-17 20:04:24.282429279 +0000 UTC m=+3.150249033" lastFinishedPulling="2026-04-17 20:04:41.269934727 +0000 UTC m=+20.137754480" observedRunningTime="2026-04-17 20:04:41.78198179 +0000 UTC m=+20.649801554" watchObservedRunningTime="2026-04-17 20:04:41.782539801 +0000 UTC m=+20.650359561"
Apr 17 20:04:41.796256 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.796202 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2m9c7" podStartSLOduration=8.455268245 podStartE2EDuration="20.796182248s" podCreationTimestamp="2026-04-17 20:04:21 +0000 UTC" firstStartedPulling="2026-04-17 20:04:24.283303866 +0000 UTC m=+3.151123605" lastFinishedPulling="2026-04-17 20:04:36.624217855 +0000 UTC m=+15.492037608" observedRunningTime="2026-04-17 20:04:41.79528334 +0000 UTC m=+20.663103101" watchObservedRunningTime="2026-04-17 20:04:41.796182248 +0000 UTC m=+20.664002024"
Apr 17 20:04:41.810872 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.810815 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-k5c2g" podStartSLOduration=3.7729842160000002 podStartE2EDuration="20.810800155s" podCreationTimestamp="2026-04-17 20:04:21 +0000 UTC" firstStartedPulling="2026-04-17 20:04:24.270713346 +0000 UTC m=+3.138533086" lastFinishedPulling="2026-04-17 20:04:41.308529264 +0000 UTC m=+20.176349025" observedRunningTime="2026-04-17 20:04:41.810270772 +0000 UTC m=+20.678090573" watchObservedRunningTime="2026-04-17 20:04:41.810800155 +0000 UTC m=+20.678619916"
Apr 17 20:04:41.825834 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:41.825793 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-g8vzm" podStartSLOduration=3.8368965040000003 podStartE2EDuration="20.825778022s" podCreationTimestamp="2026-04-17 20:04:21 +0000 UTC" firstStartedPulling="2026-04-17 20:04:24.281116106 +0000 UTC m=+3.148935851" lastFinishedPulling="2026-04-17 20:04:41.269997593 +0000 UTC m=+20.137817369" observedRunningTime="2026-04-17 20:04:41.825257793 +0000 UTC m=+20.693077555" watchObservedRunningTime="2026-04-17 20:04:41.825778022 +0000 UTC m=+20.693597783"
Apr 17 20:04:42.630518 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:42.630335 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 20:04:42.633357 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:42.633337 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2"
Apr 17 20:04:42.633466 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:42.633444 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c"
Apr 17 20:04:42.739218 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:42.739190 2576 generic.go:358] "Generic (PLEG): container finished" podID="7221eba4-90c3-41d0-809f-2cfbbcbf40d8" containerID="eeb0fbd9b5c229ca9b2818163f908daaf06e7e691ab78f2fe1ad3e35c4de5502" exitCode=0
Apr 17 20:04:42.739791 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:42.739266 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fwvg8" event={"ID":"7221eba4-90c3-41d0-809f-2cfbbcbf40d8","Type":"ContainerDied","Data":"eeb0fbd9b5c229ca9b2818163f908daaf06e7e691ab78f2fe1ad3e35c4de5502"}
Apr 17 20:04:42.740471 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:42.740446 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m6qnv" event={"ID":"f40c767d-e93d-48a0-940e-7c8f96979338","Type":"ContainerStarted","Data":"a28469d3fd68ce9331844b252f77cf2005e69a95a9e53073f717a8678096be8e"}
Apr 17 20:04:42.743120 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:42.743103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/0.log"
Apr 17 20:04:42.743456 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:42.743433 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" event={"ID":"8c7c8612-982d-4a5d-81bd-d18fab9adeef","Type":"ContainerStarted","Data":"bee7d53626addd699e02d9625d372b1ce71a16792822a3e461503704ae98f1a3"}
Apr 17 20:04:42.743532 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:42.743464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" event={"ID":"8c7c8612-982d-4a5d-81bd-d18fab9adeef","Type":"ContainerStarted","Data":"d4c8cb426e3a9d856fb6f4f5c45bb0da1b0e1a6a4990d6f190a49de214217aba"}
Apr 17 20:04:42.743532 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:42.743479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" event={"ID":"8c7c8612-982d-4a5d-81bd-d18fab9adeef","Type":"ContainerStarted","Data":"0aac600bf136e41ba80dfae9e96f000b283ff01da108c32d1121b22735960016"}
Apr 17 20:04:42.744982 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:42.744961 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" event={"ID":"9b02a788-0b0e-471a-9bdb-94dc0228f8a3","Type":"ContainerStarted","Data":"37f71647ea6f0895a6e68f70fe027b3ac6183778ee090ad47624e03bcab0ebf1"}
Apr 17 20:04:42.770970 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:42.770916 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-m6qnv" podStartSLOduration=4.779745819 podStartE2EDuration="21.770903013s" podCreationTimestamp="2026-04-17 20:04:21 +0000 UTC" firstStartedPulling="2026-04-17 20:04:24.27883061 +0000 UTC m=+3.146650363" lastFinishedPulling="2026-04-17 20:04:41.269987805 +0000 UTC m=+20.137807557" observedRunningTime="2026-04-17 20:04:42.770491365 +0000 UTC m=+21.638311125" watchObservedRunningTime="2026-04-17 20:04:42.770903013 +0000 UTC m=+21.638722773"
Apr 17 20:04:43.583646 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:43.583532 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T20:04:42.630513802Z","UUID":"94a853d0-7d7e-4b49-b89d-9a083491c7e4","Handler":null,"Name":"","Endpoint":""}
Apr 17 20:04:43.586599 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:43.586572 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 20:04:43.586722 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:43.586608 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 20:04:43.633987 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:43.633959 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr"
Apr 17 20:04:43.634174 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:43.634118 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c"
Apr 17 20:04:44.634335 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:44.634111 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2"
Apr 17 20:04:44.634754 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:44.634458 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c"
Apr 17 20:04:44.697928 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:44.697896 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-g8vzm"
Apr 17 20:04:44.698614 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:44.698588 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-g8vzm"
Apr 17 20:04:44.751492 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:44.751464 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/0.log"
Apr 17 20:04:44.751901 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:44.751871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" event={"ID":"8c7c8612-982d-4a5d-81bd-d18fab9adeef","Type":"ContainerStarted","Data":"e9ce6d7aecb74198e97783f49c62db72c20dfc730008bb3d3803f13deb9d39d1"}
Apr 17 20:04:44.753987 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:44.753932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" event={"ID":"9b02a788-0b0e-471a-9bdb-94dc0228f8a3","Type":"ContainerStarted","Data":"b8f75bfa917539a17fa52dd3a52b517ced9e7ea1a2d0bcff681a688ed6455986"}
Apr 17 20:04:44.754259 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:44.754237 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-g8vzm"
Apr 17 20:04:44.755180 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:44.755163 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-g8vzm"
Apr 17 20:04:44.769629 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:44.769588 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qnm9q" podStartSLOduration=4.319415703 podStartE2EDuration="23.769575657s" podCreationTimestamp="2026-04-17 20:04:21 +0000 UTC" firstStartedPulling="2026-04-17 20:04:24.272191406 +0000 UTC m=+3.140011152" lastFinishedPulling="2026-04-17 20:04:43.722351367 +0000 UTC m=+22.590171106" observedRunningTime="2026-04-17 20:04:44.769344035 +0000 UTC m=+23.637163823" watchObservedRunningTime="2026-04-17 20:04:44.769575657 +0000 UTC m=+23.637395418"
Apr 17 20:04:45.634328 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:45.634299 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr"
Apr 17 20:04:45.634525 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:45.634418 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c"
Apr 17 20:04:46.633751 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:46.633721 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2"
Apr 17 20:04:46.633920 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:46.633852 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c"
Apr 17 20:04:46.762562 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:46.762063 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/0.log"
Apr 17 20:04:46.762562 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:46.762439 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" event={"ID":"8c7c8612-982d-4a5d-81bd-d18fab9adeef","Type":"ContainerStarted","Data":"e5a12d59474b3c49fd777441f2846e8aa440a0c866123d91369a4a6530fc691d"}
Apr 17 20:04:46.763881 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:46.762817 2576 scope.go:117] "RemoveContainer" containerID="482a791ed515f1a8e11ac1a47fed5b08eeebdfbe3c800e09e043805f3bcadf24"
Apr 17 20:04:46.763881 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:46.762928 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz"
Apr 17 20:04:46.763881 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:46.762966 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz"
Apr 17 20:04:46.763881 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:46.762980 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz"
Apr 17 20:04:46.783636 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:46.783298 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz"
Apr 17 20:04:46.787403 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:46.786351 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz"
Apr 17 20:04:47.633913 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:47.633718 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr"
Apr 17 20:04:47.634077 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:47.633972 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c"
Apr 17 20:04:47.765192 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:47.765160 2576 generic.go:358] "Generic (PLEG): container finished" podID="7221eba4-90c3-41d0-809f-2cfbbcbf40d8" containerID="40ec3eaa2c789f40f4ae1d2912e9ec71e6d02378d54c7ea153778d04aa2d2722" exitCode=0
Apr 17 20:04:47.765608 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:47.765244 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fwvg8" event={"ID":"7221eba4-90c3-41d0-809f-2cfbbcbf40d8","Type":"ContainerDied","Data":"40ec3eaa2c789f40f4ae1d2912e9ec71e6d02378d54c7ea153778d04aa2d2722"}
Apr 17 20:04:47.768418 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:47.768401 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/0.log"
Apr 17 20:04:47.768724 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:47.768701 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" event={"ID":"8c7c8612-982d-4a5d-81bd-d18fab9adeef","Type":"ContainerStarted","Data":"ce7080dd7ac9e90607fa530d9a85af3d6daeec2539f46b72e5b3f45ddb696899"}
Apr 17 20:04:47.810598 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:47.810553 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" podStartSLOduration=9.741862982 podStartE2EDuration="26.810537607s" podCreationTimestamp="2026-04-17 20:04:21 +0000 UTC" firstStartedPulling="2026-04-17 20:04:24.273594973 +0000 UTC m=+3.141414717" lastFinishedPulling="2026-04-17 20:04:41.342269271 +0000 UTC m=+20.210089342" observedRunningTime="2026-04-17 20:04:47.810001423 +0000 UTC m=+26.677821184" watchObservedRunningTime="2026-04-17 20:04:47.810537607 +0000 UTC m=+26.678357386"
Apr 17 20:04:48.634334 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:48.634305 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2"
Apr 17 20:04:48.634480 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:48.634401 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c"
Apr 17 20:04:48.665819 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:48.665739 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cnnh2"]
Apr 17 20:04:48.666257 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:48.666234 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zgppr"]
Apr 17 20:04:48.666359 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:48.666324 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr"
Apr 17 20:04:48.666428 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:48.666407 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c"
Apr 17 20:04:48.771834 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:48.771804 2576 generic.go:358] "Generic (PLEG): container finished" podID="7221eba4-90c3-41d0-809f-2cfbbcbf40d8" containerID="c4515b80b1311c03fa0a5d305ba857d5e73150031109fe284817da0bf4f9c662" exitCode=0
Apr 17 20:04:48.772292 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:48.771883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fwvg8" event={"ID":"7221eba4-90c3-41d0-809f-2cfbbcbf40d8","Type":"ContainerDied","Data":"c4515b80b1311c03fa0a5d305ba857d5e73150031109fe284817da0bf4f9c662"}
Apr 17 20:04:48.772292 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:48.771998 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2"
Apr 17 20:04:48.772292 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:48.772119 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c"
Apr 17 20:04:49.775735 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:49.775531 2576 generic.go:358] "Generic (PLEG): container finished" podID="7221eba4-90c3-41d0-809f-2cfbbcbf40d8" containerID="04ca1bf9db116b07293bf8b3915ff48b24e2753917d9ea54ab3b5299f3d2b66b" exitCode=0
Apr 17 20:04:49.776152 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:49.775612 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fwvg8" event={"ID":"7221eba4-90c3-41d0-809f-2cfbbcbf40d8","Type":"ContainerDied","Data":"04ca1bf9db116b07293bf8b3915ff48b24e2753917d9ea54ab3b5299f3d2b66b"}
Apr 17 20:04:50.634281 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:50.634201 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2"
Apr 17 20:04:50.634457 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:50.634201 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr"
Apr 17 20:04:50.634457 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:50.634338 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c"
Apr 17 20:04:50.634577 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:50.634455 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c"
Apr 17 20:04:52.634293 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:52.634260 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2"
Apr 17 20:04:52.634910 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:52.634259 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr"
Apr 17 20:04:52.634910 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:52.634409 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cnnh2" podUID="7e87cc22-ed21-46b2-9dac-b1a3906efc5c"
Apr 17 20:04:52.634910 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:52.634482 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgppr" podUID="8ea69030-bfbd-46e0-bba4-3408bc57cb1c"
Apr 17 20:04:54.518749 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.518673 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-172.ec2.internal" event="NodeReady"
Apr 17 20:04:54.519349 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.518817 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 20:04:54.566520 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.566484 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p227m"]
Apr 17 20:04:54.571962 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.571933 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-l2wrg"]
Apr 17 20:04:54.572100 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.572079 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p227m"
Apr 17 20:04:54.574232 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.574201 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 20:04:54.574232 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.574221 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 20:04:54.574232 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.574230 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4cbzc\""
Apr 17 20:04:54.575476 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.575449 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l2wrg"
Apr 17 20:04:54.577565 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.577546 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 20:04:54.577822 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.577783 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p227m"]
Apr 17 20:04:54.577943 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.577900 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 20:04:54.577943 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.577931 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tl4j2\""
Apr 17 20:04:54.578643 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.578571 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 20:04:54.581034 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.581002 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l2wrg"]
Apr 17 20:04:54.634308 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.634279 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2"
Apr 17 20:04:54.634488 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.634285 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr"
Apr 17 20:04:54.637039 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.637017 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 20:04:54.637181 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.637077 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8c6pz\""
Apr 17 20:04:54.637181 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.637077 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 20:04:54.637181 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.637159 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x8jh2\""
Apr 17 20:04:54.637356 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.637325 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 20:04:54.669757 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.669725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvpg7\" (UniqueName: \"kubernetes.io/projected/642f7cfc-05bf-4010-8948-bc21230b60e0-kube-api-access-kvpg7\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg"
Apr 17 20:04:54.669930 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.669764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8314d958-7474-40f1-b9c0-6ce6e8e69f40-config-volume\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m"
Apr 17 20:04:54.669930 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.669790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg"
Apr 17 20:04:54.669930 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.669816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m"
Apr 17 20:04:54.669930 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.669865 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8314d958-7474-40f1-b9c0-6ce6e8e69f40-tmp-dir\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m"
Apr 17 20:04:54.670123 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.669959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfvw5\" (UniqueName: \"kubernetes.io/projected/8314d958-7474-40f1-b9c0-6ce6e8e69f40-kube-api-access-jfvw5\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m"
Apr 17 20:04:54.770536 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.770453 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfvw5\" (UniqueName: \"kubernetes.io/projected/8314d958-7474-40f1-b9c0-6ce6e8e69f40-kube-api-access-jfvw5\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m"
Apr 17 20:04:54.770690 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.770541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpg7\" (UniqueName: \"kubernetes.io/projected/642f7cfc-05bf-4010-8948-bc21230b60e0-kube-api-access-kvpg7\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg"
Apr 17 20:04:54.771166 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.770924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8314d958-7474-40f1-b9c0-6ce6e8e69f40-config-volume\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m"
Apr 17 20:04:54.771166 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.770985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg"
Apr 17 20:04:54.771166 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.771017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m"
Apr 17 20:04:54.771166 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.771063 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8314d958-7474-40f1-b9c0-6ce6e8e69f40-tmp-dir\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m"
Apr 17 20:04:54.771455 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:54.771243 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:04:54.771455 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:54.771368 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:04:54.771560 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:54.771517 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert podName:642f7cfc-05bf-4010-8948-bc21230b60e0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:55.271493921 +0000 UTC m=+34.139313661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert") pod "ingress-canary-l2wrg" (UID: "642f7cfc-05bf-4010-8948-bc21230b60e0") : secret "canary-serving-cert" not found
Apr 17 20:04:54.771560 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.771525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8314d958-7474-40f1-b9c0-6ce6e8e69f40-tmp-dir\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m"
Apr 17 20:04:54.773695 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:54.771694 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls podName:8314d958-7474-40f1-b9c0-6ce6e8e69f40 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:55.271543648 +0000 UTC m=+34.139363401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls") pod "dns-default-p227m" (UID: "8314d958-7474-40f1-b9c0-6ce6e8e69f40") : secret "dns-default-metrics-tls" not found
Apr 17 20:04:54.773695 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.771754 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8314d958-7474-40f1-b9c0-6ce6e8e69f40-config-volume\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m"
Apr 17 20:04:54.781525 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.781504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfvw5\" (UniqueName: \"kubernetes.io/projected/8314d958-7474-40f1-b9c0-6ce6e8e69f40-kube-api-access-jfvw5\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m"
Apr 17 20:04:54.781648 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:54.781551 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvpg7\" (UniqueName: \"kubernetes.io/projected/642f7cfc-05bf-4010-8948-bc21230b60e0-kube-api-access-kvpg7\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg"
Apr 17 20:04:55.274955 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:55.274913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg"
Apr 17 20:04:55.274955 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:55.274959 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m" Apr 17 20:04:55.275143 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:55.274999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:04:55.275143 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:55.275068 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:04:55.275143 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:55.275087 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:04:55.275143 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:55.275120 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:04:55.275269 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:55.275145 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs podName:7e87cc22-ed21-46b2-9dac-b1a3906efc5c nodeName:}" failed. No retries permitted until 2026-04-17 20:05:27.275126289 +0000 UTC m=+66.142946041 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs") pod "network-metrics-daemon-cnnh2" (UID: "7e87cc22-ed21-46b2-9dac-b1a3906efc5c") : secret "metrics-daemon-secret" not found Apr 17 20:04:55.275269 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:55.275160 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert podName:642f7cfc-05bf-4010-8948-bc21230b60e0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:56.275153149 +0000 UTC m=+35.142972889 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert") pod "ingress-canary-l2wrg" (UID: "642f7cfc-05bf-4010-8948-bc21230b60e0") : secret "canary-serving-cert" not found Apr 17 20:04:55.275269 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:55.275170 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls podName:8314d958-7474-40f1-b9c0-6ce6e8e69f40 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:56.275164636 +0000 UTC m=+35.142984375 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls") pod "dns-default-p227m" (UID: "8314d958-7474-40f1-b9c0-6ce6e8e69f40") : secret "dns-default-metrics-tls" not found Apr 17 20:04:55.476773 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:55.476734 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whvrs\" (UniqueName: \"kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs\") pod \"network-check-target-zgppr\" (UID: \"8ea69030-bfbd-46e0-bba4-3408bc57cb1c\") " pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:55.479162 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:55.479141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whvrs\" (UniqueName: \"kubernetes.io/projected/8ea69030-bfbd-46e0-bba4-3408bc57cb1c-kube-api-access-whvrs\") pod \"network-check-target-zgppr\" (UID: \"8ea69030-bfbd-46e0-bba4-3408bc57cb1c\") " pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:55.551579 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:55.551498 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:55.768365 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:55.768186 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zgppr"] Apr 17 20:04:55.771963 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:04:55.771937 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ea69030_bfbd_46e0_bba4_3408bc57cb1c.slice/crio-c1a5c30d72f133b3134c78799bcaf6e8dbb4b39e693474f02c23e3bd6338a83d WatchSource:0}: Error finding container c1a5c30d72f133b3134c78799bcaf6e8dbb4b39e693474f02c23e3bd6338a83d: Status 404 returned error can't find the container with id c1a5c30d72f133b3134c78799bcaf6e8dbb4b39e693474f02c23e3bd6338a83d Apr 17 20:04:55.788012 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:55.787986 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zgppr" event={"ID":"8ea69030-bfbd-46e0-bba4-3408bc57cb1c","Type":"ContainerStarted","Data":"c1a5c30d72f133b3134c78799bcaf6e8dbb4b39e693474f02c23e3bd6338a83d"} Apr 17 20:04:56.284418 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:56.284326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg" Apr 17 20:04:56.284418 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:56.284368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m" Apr 17 20:04:56.284614 
ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:56.284465 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:04:56.284614 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:56.284477 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:04:56.284614 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:56.284525 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls podName:8314d958-7474-40f1-b9c0-6ce6e8e69f40 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:58.284510524 +0000 UTC m=+37.152330268 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls") pod "dns-default-p227m" (UID: "8314d958-7474-40f1-b9c0-6ce6e8e69f40") : secret "dns-default-metrics-tls" not found Apr 17 20:04:56.284614 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:56.284538 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert podName:642f7cfc-05bf-4010-8948-bc21230b60e0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:58.284532386 +0000 UTC m=+37.152352125 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert") pod "ingress-canary-l2wrg" (UID: "642f7cfc-05bf-4010-8948-bc21230b60e0") : secret "canary-serving-cert" not found Apr 17 20:04:56.793329 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:56.793292 2576 generic.go:358] "Generic (PLEG): container finished" podID="7221eba4-90c3-41d0-809f-2cfbbcbf40d8" containerID="c9eff2cae3c26ade506d20be6f47b754d69295c5840c78c484b63b60a2fae84a" exitCode=0 Apr 17 20:04:56.793329 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:56.793322 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fwvg8" event={"ID":"7221eba4-90c3-41d0-809f-2cfbbcbf40d8","Type":"ContainerDied","Data":"c9eff2cae3c26ade506d20be6f47b754d69295c5840c78c484b63b60a2fae84a"} Apr 17 20:04:57.799515 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:57.799479 2576 generic.go:358] "Generic (PLEG): container finished" podID="7221eba4-90c3-41d0-809f-2cfbbcbf40d8" containerID="73e33877ee208f88758187c1f7e41c024dc926ed2501a15b7b02421b23bfd68d" exitCode=0 Apr 17 20:04:57.800000 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:57.799556 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fwvg8" event={"ID":"7221eba4-90c3-41d0-809f-2cfbbcbf40d8","Type":"ContainerDied","Data":"73e33877ee208f88758187c1f7e41c024dc926ed2501a15b7b02421b23bfd68d"} Apr 17 20:04:58.300284 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:58.300239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg" Apr 17 20:04:58.300450 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:58.300294 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m" Apr 17 20:04:58.300450 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:58.300401 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:04:58.300450 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:58.300405 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:04:58.300582 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:58.300467 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert podName:642f7cfc-05bf-4010-8948-bc21230b60e0 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:02.30044712 +0000 UTC m=+41.168266860 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert") pod "ingress-canary-l2wrg" (UID: "642f7cfc-05bf-4010-8948-bc21230b60e0") : secret "canary-serving-cert" not found Apr 17 20:04:58.300582 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:04:58.300483 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls podName:8314d958-7474-40f1-b9c0-6ce6e8e69f40 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:02.300476869 +0000 UTC m=+41.168296608 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls") pod "dns-default-p227m" (UID: "8314d958-7474-40f1-b9c0-6ce6e8e69f40") : secret "dns-default-metrics-tls" not found Apr 17 20:04:58.807434 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:58.807259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fwvg8" event={"ID":"7221eba4-90c3-41d0-809f-2cfbbcbf40d8","Type":"ContainerStarted","Data":"3153126d6f54e5ccf9ae0451ab6c306bd23135c442455722282d7374ba635899"} Apr 17 20:04:58.830394 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:58.830355 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fwvg8" podStartSLOduration=6.481739857 podStartE2EDuration="37.830341596s" podCreationTimestamp="2026-04-17 20:04:21 +0000 UTC" firstStartedPulling="2026-04-17 20:04:24.279424077 +0000 UTC m=+3.147243831" lastFinishedPulling="2026-04-17 20:04:55.628025817 +0000 UTC m=+34.495845570" observedRunningTime="2026-04-17 20:04:58.829292787 +0000 UTC m=+37.697112550" watchObservedRunningTime="2026-04-17 20:04:58.830341596 +0000 UTC m=+37.698161357" Apr 17 20:04:59.810922 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:59.810888 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zgppr" event={"ID":"8ea69030-bfbd-46e0-bba4-3408bc57cb1c","Type":"ContainerStarted","Data":"f12f711364802a5d2a874ecb07a159d89d7903ed065be7270c471be03d925aac"} Apr 17 20:04:59.811287 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:59.811122 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:04:59.824761 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:04:59.824723 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-zgppr" podStartSLOduration=35.876323377 podStartE2EDuration="38.8247099s" podCreationTimestamp="2026-04-17 20:04:21 +0000 UTC" firstStartedPulling="2026-04-17 20:04:55.773963107 +0000 UTC m=+34.641782846" lastFinishedPulling="2026-04-17 20:04:58.722349627 +0000 UTC m=+37.590169369" observedRunningTime="2026-04-17 20:04:59.824629395 +0000 UTC m=+38.692449155" watchObservedRunningTime="2026-04-17 20:04:59.8247099 +0000 UTC m=+38.692529660" Apr 17 20:05:02.328056 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:02.328014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg" Apr 17 20:05:02.328056 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:02.328058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m" Apr 17 20:05:02.328494 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:02.328148 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:05:02.328494 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:02.328169 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:05:02.328494 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:02.328202 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls podName:8314d958-7474-40f1-b9c0-6ce6e8e69f40 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:05:10.328187178 +0000 UTC m=+49.196006917 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls") pod "dns-default-p227m" (UID: "8314d958-7474-40f1-b9c0-6ce6e8e69f40") : secret "dns-default-metrics-tls" not found Apr 17 20:05:02.328494 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:02.328234 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert podName:642f7cfc-05bf-4010-8948-bc21230b60e0 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:10.328219629 +0000 UTC m=+49.196039368 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert") pod "ingress-canary-l2wrg" (UID: "642f7cfc-05bf-4010-8948-bc21230b60e0") : secret "canary-serving-cert" not found Apr 17 20:05:10.381742 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:10.381702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg" Apr 17 20:05:10.381742 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:10.381749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m" Apr 17 20:05:10.382384 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:10.381887 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:05:10.382384 
ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:10.381961 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls podName:8314d958-7474-40f1-b9c0-6ce6e8e69f40 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:26.381944169 +0000 UTC m=+65.249763907 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls") pod "dns-default-p227m" (UID: "8314d958-7474-40f1-b9c0-6ce6e8e69f40") : secret "dns-default-metrics-tls" not found Apr 17 20:05:10.382384 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:10.381887 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:05:10.382384 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:10.382025 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert podName:642f7cfc-05bf-4010-8948-bc21230b60e0 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:26.382011959 +0000 UTC m=+65.249831708 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert") pod "ingress-canary-l2wrg" (UID: "642f7cfc-05bf-4010-8948-bc21230b60e0") : secret "canary-serving-cert" not found Apr 17 20:05:18.782205 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:18.782175 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2wkz" Apr 17 20:05:26.389359 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:26.389318 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg" Apr 17 20:05:26.389359 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:26.389360 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m" Apr 17 20:05:26.389836 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:26.389474 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:05:26.389836 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:26.389495 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:05:26.389836 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:26.389531 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls podName:8314d958-7474-40f1-b9c0-6ce6e8e69f40 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:05:58.389515119 +0000 UTC m=+97.257334858 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls") pod "dns-default-p227m" (UID: "8314d958-7474-40f1-b9c0-6ce6e8e69f40") : secret "dns-default-metrics-tls" not found Apr 17 20:05:26.389836 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:26.389606 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert podName:642f7cfc-05bf-4010-8948-bc21230b60e0 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:58.38958653 +0000 UTC m=+97.257406286 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert") pod "ingress-canary-l2wrg" (UID: "642f7cfc-05bf-4010-8948-bc21230b60e0") : secret "canary-serving-cert" not found Apr 17 20:05:27.295851 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.295810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:05:27.296029 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:27.295977 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:05:27.296074 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:27.296048 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs podName:7e87cc22-ed21-46b2-9dac-b1a3906efc5c nodeName:}" failed. 
No retries permitted until 2026-04-17 20:06:31.296030222 +0000 UTC m=+130.163849960 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs") pod "network-metrics-daemon-cnnh2" (UID: "7e87cc22-ed21-46b2-9dac-b1a3906efc5c") : secret "metrics-daemon-secret" not found Apr 17 20:05:27.731110 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.731080 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc"] Apr 17 20:05:27.737547 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.737524 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5f85496cb-r87zd"] Apr 17 20:05:27.737701 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.737683 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:27.740473 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.740446 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.741267 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.741248 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-k5hr6\"" Apr 17 20:05:27.741425 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.741361 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 20:05:27.741707 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.741606 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 20:05:27.741707 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.741683 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 20:05:27.741874 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.741816 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 20:05:27.742752 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.742733 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc"] Apr 17 20:05:27.743041 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.743020 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 20:05:27.743153 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.743059 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 20:05:27.743153 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.743067 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 20:05:27.743153 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.743028 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-jwwmg\"" Apr 17 20:05:27.743153 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.743126 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 20:05:27.743444 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.743428 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 20:05:27.743736 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.743712 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 20:05:27.745078 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.745058 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5f85496cb-r87zd"] Apr 17 20:05:27.800630 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.800602 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.800757 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.800645 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 
20:05:27.800757 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.800671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:27.800832 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.800762 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/eea67969-35bc-420e-a3e8-819b529619bf-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:27.800832 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.800780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4gzs\" (UniqueName: \"kubernetes.io/projected/eea67969-35bc-420e-a3e8-819b529619bf-kube-api-access-k4gzs\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:27.800832 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.800818 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpr5r\" (UniqueName: \"kubernetes.io/projected/69b938b6-9d16-44cc-9e04-55819bafd06e-kube-api-access-tpr5r\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.800950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.800880 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-default-certificate\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.800950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.800905 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-stats-auth\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.901477 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.901446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/eea67969-35bc-420e-a3e8-819b529619bf-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:27.901637 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.901484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4gzs\" (UniqueName: \"kubernetes.io/projected/eea67969-35bc-420e-a3e8-819b529619bf-kube-api-access-k4gzs\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:27.901637 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.901502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpr5r\" (UniqueName: \"kubernetes.io/projected/69b938b6-9d16-44cc-9e04-55819bafd06e-kube-api-access-tpr5r\") 
pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.901637 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.901521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-default-certificate\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.901637 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.901539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-stats-auth\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.901637 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.901572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.901637 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.901590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.901637 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.901606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:27.902008 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:27.901726 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:05:27.902008 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:27.901791 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:05:27.902008 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:27.901799 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs podName:69b938b6-9d16-44cc-9e04-55819bafd06e nodeName:}" failed. No retries permitted until 2026-04-17 20:05:28.40177683 +0000 UTC m=+67.269596586 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs") pod "router-default-5f85496cb-r87zd" (UID: "69b938b6-9d16-44cc-9e04-55819bafd06e") : secret "router-metrics-certs-default" not found Apr 17 20:05:27.902008 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:27.901826 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls podName:eea67969-35bc-420e-a3e8-819b529619bf nodeName:}" failed. No retries permitted until 2026-04-17 20:05:28.401816128 +0000 UTC m=+67.269635868 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fbvmc" (UID: "eea67969-35bc-420e-a3e8-819b529619bf") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:05:27.902008 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:27.901927 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle podName:69b938b6-9d16-44cc-9e04-55819bafd06e nodeName:}" failed. No retries permitted until 2026-04-17 20:05:28.401910717 +0000 UTC m=+67.269730461 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle") pod "router-default-5f85496cb-r87zd" (UID: "69b938b6-9d16-44cc-9e04-55819bafd06e") : configmap references non-existent config key: service-ca.crt Apr 17 20:05:27.902275 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.902255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/eea67969-35bc-420e-a3e8-819b529619bf-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:27.905480 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.905448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-default-certificate\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.910221 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.910197 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-stats-auth\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.911184 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.911161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpr5r\" (UniqueName: \"kubernetes.io/projected/69b938b6-9d16-44cc-9e04-55819bafd06e-kube-api-access-tpr5r\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:27.911268 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:27.911161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4gzs\" (UniqueName: \"kubernetes.io/projected/eea67969-35bc-420e-a3e8-819b529619bf-kube-api-access-k4gzs\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:28.405211 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:28.405164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:28.405211 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:28.405212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " 
pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:28.405584 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:28.405233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:28.405755 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:28.405723 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:05:28.405835 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:28.405724 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:05:28.405835 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:28.405813 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle podName:69b938b6-9d16-44cc-9e04-55819bafd06e nodeName:}" failed. No retries permitted until 2026-04-17 20:05:29.405780148 +0000 UTC m=+68.273599887 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle") pod "router-default-5f85496cb-r87zd" (UID: "69b938b6-9d16-44cc-9e04-55819bafd06e") : configmap references non-existent config key: service-ca.crt Apr 17 20:05:28.405976 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:28.405866 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs podName:69b938b6-9d16-44cc-9e04-55819bafd06e nodeName:}" failed. 
No retries permitted until 2026-04-17 20:05:29.405831134 +0000 UTC m=+68.273650877 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs") pod "router-default-5f85496cb-r87zd" (UID: "69b938b6-9d16-44cc-9e04-55819bafd06e") : secret "router-metrics-certs-default" not found Apr 17 20:05:28.405976 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:28.405902 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls podName:eea67969-35bc-420e-a3e8-819b529619bf nodeName:}" failed. No retries permitted until 2026-04-17 20:05:29.405874882 +0000 UTC m=+68.273694621 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fbvmc" (UID: "eea67969-35bc-420e-a3e8-819b529619bf") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:05:29.415409 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:29.415362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:29.415409 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:29.415410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:29.415828 
ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:29.415432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:29.415828 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:29.415533 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:05:29.415828 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:29.415572 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle podName:69b938b6-9d16-44cc-9e04-55819bafd06e nodeName:}" failed. No retries permitted until 2026-04-17 20:05:31.415551812 +0000 UTC m=+70.283371573 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle") pod "router-default-5f85496cb-r87zd" (UID: "69b938b6-9d16-44cc-9e04-55819bafd06e") : configmap references non-existent config key: service-ca.crt Apr 17 20:05:29.415828 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:29.415543 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:05:29.415828 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:29.415598 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs podName:69b938b6-9d16-44cc-9e04-55819bafd06e nodeName:}" failed. No retries permitted until 2026-04-17 20:05:31.415590841 +0000 UTC m=+70.283410580 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs") pod "router-default-5f85496cb-r87zd" (UID: "69b938b6-9d16-44cc-9e04-55819bafd06e") : secret "router-metrics-certs-default" not found Apr 17 20:05:29.415828 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:29.415617 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls podName:eea67969-35bc-420e-a3e8-819b529619bf nodeName:}" failed. No retries permitted until 2026-04-17 20:05:31.415604141 +0000 UTC m=+70.283423879 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fbvmc" (UID: "eea67969-35bc-420e-a3e8-819b529619bf") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:05:30.814770 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:30.814741 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zgppr" Apr 17 20:05:31.430105 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:31.430053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:31.430279 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:31.430116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs\") pod \"router-default-5f85496cb-r87zd\" (UID: 
\"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:31.430279 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:31.430153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:31.430279 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:31.430230 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle podName:69b938b6-9d16-44cc-9e04-55819bafd06e nodeName:}" failed. No retries permitted until 2026-04-17 20:05:35.430209735 +0000 UTC m=+74.298029500 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle") pod "router-default-5f85496cb-r87zd" (UID: "69b938b6-9d16-44cc-9e04-55819bafd06e") : configmap references non-existent config key: service-ca.crt Apr 17 20:05:31.430390 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:31.430275 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:05:31.430390 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:31.430334 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs podName:69b938b6-9d16-44cc-9e04-55819bafd06e nodeName:}" failed. No retries permitted until 2026-04-17 20:05:35.430322003 +0000 UTC m=+74.298141742 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs") pod "router-default-5f85496cb-r87zd" (UID: "69b938b6-9d16-44cc-9e04-55819bafd06e") : secret "router-metrics-certs-default" not found Apr 17 20:05:31.430390 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:31.430278 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:05:31.430390 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:31.430367 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls podName:eea67969-35bc-420e-a3e8-819b529619bf nodeName:}" failed. No retries permitted until 2026-04-17 20:05:35.430360269 +0000 UTC m=+74.298180008 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fbvmc" (UID: "eea67969-35bc-420e-a3e8-819b529619bf") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:05:34.122809 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:34.122678 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fd4vp_959e3ae9-cf8e-4e50-b31f-76cf29653018/dns-node-resolver/0.log" Apr 17 20:05:34.723048 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:34.723015 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2m9c7_eb52d0ba-f85e-4263-9b90-db812ebeca7c/node-ca/0.log" Apr 17 20:05:35.461675 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:35.461636 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:35.461675 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:35.461678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:35.462093 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:35.461698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:35.462093 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:35.461801 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:05:35.462093 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:35.461813 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle podName:69b938b6-9d16-44cc-9e04-55819bafd06e nodeName:}" failed. No retries permitted until 2026-04-17 20:05:43.461794368 +0000 UTC m=+82.329614107 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle") pod "router-default-5f85496cb-r87zd" (UID: "69b938b6-9d16-44cc-9e04-55819bafd06e") : configmap references non-existent config key: service-ca.crt Apr 17 20:05:35.462093 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:35.461824 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:05:35.462093 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:35.461889 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs podName:69b938b6-9d16-44cc-9e04-55819bafd06e nodeName:}" failed. No retries permitted until 2026-04-17 20:05:43.461869416 +0000 UTC m=+82.329689159 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs") pod "router-default-5f85496cb-r87zd" (UID: "69b938b6-9d16-44cc-9e04-55819bafd06e") : secret "router-metrics-certs-default" not found Apr 17 20:05:35.462093 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:35.461906 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls podName:eea67969-35bc-420e-a3e8-819b529619bf nodeName:}" failed. No retries permitted until 2026-04-17 20:05:43.461898152 +0000 UTC m=+82.329717891 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fbvmc" (UID: "eea67969-35bc-420e-a3e8-819b529619bf") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:05:37.693344 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.693308 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m55qn"] Apr 17 20:05:37.698072 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.698054 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m55qn" Apr 17 20:05:37.700452 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.700432 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 20:05:37.700552 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.700436 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:05:37.701295 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.701281 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-p4d4l\"" Apr 17 20:05:37.705210 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.704833 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m55qn"] Apr 17 20:05:37.778729 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.778690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfq64\" (UniqueName: 
\"kubernetes.io/projected/cc9af87a-db40-4e36-8c0e-1c4f68713a3d-kube-api-access-bfq64\") pod \"volume-data-source-validator-7c6cbb6c87-m55qn\" (UID: \"cc9af87a-db40-4e36-8c0e-1c4f68713a3d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m55qn" Apr 17 20:05:37.795135 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.795103 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl"] Apr 17 20:05:37.798432 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.798410 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" Apr 17 20:05:37.800819 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.800799 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-x2pmw\"" Apr 17 20:05:37.801086 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.801070 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 20:05:37.801132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.801108 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 20:05:37.801306 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.801289 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:05:37.801489 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.801476 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 20:05:37.808464 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.808442 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl"] Apr 17 20:05:37.879235 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.879196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfq64\" (UniqueName: \"kubernetes.io/projected/cc9af87a-db40-4e36-8c0e-1c4f68713a3d-kube-api-access-bfq64\") pod \"volume-data-source-validator-7c6cbb6c87-m55qn\" (UID: \"cc9af87a-db40-4e36-8c0e-1c4f68713a3d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m55qn" Apr 17 20:05:37.879396 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.879286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sww7h\" (UniqueName: \"kubernetes.io/projected/fb0f783e-6eb1-4b42-82f2-7eed827adacb-kube-api-access-sww7h\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrtjl\" (UID: \"fb0f783e-6eb1-4b42-82f2-7eed827adacb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" Apr 17 20:05:37.879396 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.879311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb0f783e-6eb1-4b42-82f2-7eed827adacb-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrtjl\" (UID: \"fb0f783e-6eb1-4b42-82f2-7eed827adacb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" Apr 17 20:05:37.879396 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.879328 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb0f783e-6eb1-4b42-82f2-7eed827adacb-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrtjl\" (UID: \"fb0f783e-6eb1-4b42-82f2-7eed827adacb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" Apr 17 20:05:37.889131 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.889104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfq64\" (UniqueName: \"kubernetes.io/projected/cc9af87a-db40-4e36-8c0e-1c4f68713a3d-kube-api-access-bfq64\") pod \"volume-data-source-validator-7c6cbb6c87-m55qn\" (UID: \"cc9af87a-db40-4e36-8c0e-1c4f68713a3d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m55qn" Apr 17 20:05:37.980724 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.980631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sww7h\" (UniqueName: \"kubernetes.io/projected/fb0f783e-6eb1-4b42-82f2-7eed827adacb-kube-api-access-sww7h\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrtjl\" (UID: \"fb0f783e-6eb1-4b42-82f2-7eed827adacb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" Apr 17 20:05:37.980724 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.980672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb0f783e-6eb1-4b42-82f2-7eed827adacb-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrtjl\" (UID: \"fb0f783e-6eb1-4b42-82f2-7eed827adacb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" Apr 17 20:05:37.980724 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.980694 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb0f783e-6eb1-4b42-82f2-7eed827adacb-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrtjl\" (UID: \"fb0f783e-6eb1-4b42-82f2-7eed827adacb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" Apr 17 20:05:37.981302 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.981274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb0f783e-6eb1-4b42-82f2-7eed827adacb-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrtjl\" (UID: \"fb0f783e-6eb1-4b42-82f2-7eed827adacb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" Apr 17 20:05:37.982850 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.982825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb0f783e-6eb1-4b42-82f2-7eed827adacb-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrtjl\" (UID: \"fb0f783e-6eb1-4b42-82f2-7eed827adacb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" Apr 17 20:05:37.988274 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:37.988252 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sww7h\" (UniqueName: \"kubernetes.io/projected/fb0f783e-6eb1-4b42-82f2-7eed827adacb-kube-api-access-sww7h\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrtjl\" (UID: \"fb0f783e-6eb1-4b42-82f2-7eed827adacb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" Apr 17 20:05:38.008324 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:38.008293 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m55qn" Apr 17 20:05:38.107159 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:38.107130 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" Apr 17 20:05:38.140274 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:38.140242 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m55qn"] Apr 17 20:05:38.144046 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:05:38.143994 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc9af87a_db40_4e36_8c0e_1c4f68713a3d.slice/crio-51aa372d01b5bee92651fda0ce1fc22e40d080a50476ad0cdea98314f41e1985 WatchSource:0}: Error finding container 51aa372d01b5bee92651fda0ce1fc22e40d080a50476ad0cdea98314f41e1985: Status 404 returned error can't find the container with id 51aa372d01b5bee92651fda0ce1fc22e40d080a50476ad0cdea98314f41e1985 Apr 17 20:05:38.221639 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:38.221598 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl"] Apr 17 20:05:38.224987 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:05:38.224961 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb0f783e_6eb1_4b42_82f2_7eed827adacb.slice/crio-c76c3b62884de111456e9b5f57922eaf7bfc14d307e7dcc69808f4c8fa92cbff WatchSource:0}: Error finding container c76c3b62884de111456e9b5f57922eaf7bfc14d307e7dcc69808f4c8fa92cbff: Status 404 returned error can't find the container with id c76c3b62884de111456e9b5f57922eaf7bfc14d307e7dcc69808f4c8fa92cbff Apr 17 20:05:38.886503 
ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:38.886451 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" event={"ID":"fb0f783e-6eb1-4b42-82f2-7eed827adacb","Type":"ContainerStarted","Data":"c76c3b62884de111456e9b5f57922eaf7bfc14d307e7dcc69808f4c8fa92cbff"} Apr 17 20:05:38.887726 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:38.887692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m55qn" event={"ID":"cc9af87a-db40-4e36-8c0e-1c4f68713a3d","Type":"ContainerStarted","Data":"51aa372d01b5bee92651fda0ce1fc22e40d080a50476ad0cdea98314f41e1985"} Apr 17 20:05:39.891382 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:39.891338 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m55qn" event={"ID":"cc9af87a-db40-4e36-8c0e-1c4f68713a3d","Type":"ContainerStarted","Data":"68d8e7ea118721b6f4e0618954714c66965af15609e40e9f9a2360f119df1926"} Apr 17 20:05:39.909364 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:39.909311 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m55qn" podStartSLOduration=1.591707439 podStartE2EDuration="2.909293873s" podCreationTimestamp="2026-04-17 20:05:37 +0000 UTC" firstStartedPulling="2026-04-17 20:05:38.146140472 +0000 UTC m=+77.013960210" lastFinishedPulling="2026-04-17 20:05:39.463726892 +0000 UTC m=+78.331546644" observedRunningTime="2026-04-17 20:05:39.909139786 +0000 UTC m=+78.776959547" watchObservedRunningTime="2026-04-17 20:05:39.909293873 +0000 UTC m=+78.777113633" Apr 17 20:05:40.895000 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:40.894964 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" event={"ID":"fb0f783e-6eb1-4b42-82f2-7eed827adacb","Type":"ContainerStarted","Data":"2dd9e741be1f9f56a9fdc9b1aa6367df5f7df3497db96c7b266f9fa194fe148b"} Apr 17 20:05:42.670090 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:42.670039 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" podStartSLOduration=3.596198442 podStartE2EDuration="5.670019513s" podCreationTimestamp="2026-04-17 20:05:37 +0000 UTC" firstStartedPulling="2026-04-17 20:05:38.226701986 +0000 UTC m=+77.094521725" lastFinishedPulling="2026-04-17 20:05:40.300523054 +0000 UTC m=+79.168342796" observedRunningTime="2026-04-17 20:05:40.909439095 +0000 UTC m=+79.777258857" watchObservedRunningTime="2026-04-17 20:05:42.670019513 +0000 UTC m=+81.537839268" Apr 17 20:05:42.671304 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:42.671278 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dvq7l"] Apr 17 20:05:42.675620 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:42.675603 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dvq7l" Apr 17 20:05:42.677727 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:42.677701 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 20:05:42.678513 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:42.678496 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-42twh\"" Apr 17 20:05:42.678568 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:42.678514 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 20:05:42.680162 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:42.680126 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dvq7l"] Apr 17 20:05:42.725547 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:42.725513 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpvcm\" (UniqueName: \"kubernetes.io/projected/3c3a02fe-7f40-416b-90f8-3df7975d906c-kube-api-access-dpvcm\") pod \"migrator-74bb7799d9-dvq7l\" (UID: \"3c3a02fe-7f40-416b-90f8-3df7975d906c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dvq7l" Apr 17 20:05:42.826140 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:42.826107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpvcm\" (UniqueName: \"kubernetes.io/projected/3c3a02fe-7f40-416b-90f8-3df7975d906c-kube-api-access-dpvcm\") pod \"migrator-74bb7799d9-dvq7l\" (UID: \"3c3a02fe-7f40-416b-90f8-3df7975d906c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dvq7l" Apr 17 20:05:42.833893 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:05:42.833860 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpvcm\" (UniqueName: \"kubernetes.io/projected/3c3a02fe-7f40-416b-90f8-3df7975d906c-kube-api-access-dpvcm\") pod \"migrator-74bb7799d9-dvq7l\" (UID: \"3c3a02fe-7f40-416b-90f8-3df7975d906c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dvq7l" Apr 17 20:05:42.984936 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:42.984856 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dvq7l" Apr 17 20:05:43.100928 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.100899 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dvq7l"] Apr 17 20:05:43.103955 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:05:43.103927 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3a02fe_7f40_416b_90f8_3df7975d906c.slice/crio-8ab2bc34886554419aef510a20eeed68966ac8a84a449dbcf6179f41a2522281 WatchSource:0}: Error finding container 8ab2bc34886554419aef510a20eeed68966ac8a84a449dbcf6179f41a2522281: Status 404 returned error can't find the container with id 8ab2bc34886554419aef510a20eeed68966ac8a84a449dbcf6179f41a2522281 Apr 17 20:05:43.530284 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.530249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:43.530499 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.530356 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:43.530499 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.530377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:43.530499 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:43.530425 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:05:43.530499 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:43.530470 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:05:43.530665 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:43.530510 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls podName:eea67969-35bc-420e-a3e8-819b529619bf nodeName:}" failed. No retries permitted until 2026-04-17 20:05:59.53048878 +0000 UTC m=+98.398308526 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fbvmc" (UID: "eea67969-35bc-420e-a3e8-819b529619bf") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:05:43.530665 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:43.530529 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle podName:69b938b6-9d16-44cc-9e04-55819bafd06e nodeName:}" failed. No retries permitted until 2026-04-17 20:05:59.53052068 +0000 UTC m=+98.398340421 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle") pod "router-default-5f85496cb-r87zd" (UID: "69b938b6-9d16-44cc-9e04-55819bafd06e") : configmap references non-existent config key: service-ca.crt Apr 17 20:05:43.530665 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:43.530546 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs podName:69b938b6-9d16-44cc-9e04-55819bafd06e nodeName:}" failed. No retries permitted until 2026-04-17 20:05:59.530536549 +0000 UTC m=+98.398356287 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs") pod "router-default-5f85496cb-r87zd" (UID: "69b938b6-9d16-44cc-9e04-55819bafd06e") : secret "router-metrics-certs-default" not found Apr 17 20:05:43.590168 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.590131 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-64db8476cb-95w6j"] Apr 17 20:05:43.594464 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.594446 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.609998 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.609972 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 20:05:43.614157 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.614138 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 20:05:43.615734 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.615717 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 20:05:43.622469 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.622454 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rkflr\"" Apr 17 20:05:43.644206 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.644179 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 20:05:43.647977 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.647953 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64db8476cb-95w6j"] Apr 17 
20:05:43.731394 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.731361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8eafe28a-1549-4404-a39c-dad771c51b73-trusted-ca\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.731876 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.731400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-bound-sa-token\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.731876 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.731425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8eafe28a-1549-4404-a39c-dad771c51b73-registry-certificates\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.731876 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.731453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8eafe28a-1549-4404-a39c-dad771c51b73-installation-pull-secrets\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.731876 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.731584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.731876 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.731646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8eafe28a-1549-4404-a39c-dad771c51b73-image-registry-private-configuration\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.731876 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.731670 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8eafe28a-1549-4404-a39c-dad771c51b73-ca-trust-extracted\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.731876 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.731695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flmts\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-kube-api-access-flmts\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.832572 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.832535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flmts\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-kube-api-access-flmts\") pod \"image-registry-64db8476cb-95w6j\" 
(UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.832735 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.832604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8eafe28a-1549-4404-a39c-dad771c51b73-trusted-ca\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.832735 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.832625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-bound-sa-token\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.832735 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.832678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8eafe28a-1549-4404-a39c-dad771c51b73-registry-certificates\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.832735 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.832703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8eafe28a-1549-4404-a39c-dad771c51b73-installation-pull-secrets\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.832916 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.832784 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.832950 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:43.832932 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:05:43.832983 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:43.832950 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64db8476cb-95w6j: secret "image-registry-tls" not found Apr 17 20:05:43.833014 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.832977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8eafe28a-1549-4404-a39c-dad771c51b73-image-registry-private-configuration\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.833049 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:43.833020 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls podName:8eafe28a-1549-4404-a39c-dad771c51b73 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:44.332989101 +0000 UTC m=+83.200808857 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls") pod "image-registry-64db8476cb-95w6j" (UID: "8eafe28a-1549-4404-a39c-dad771c51b73") : secret "image-registry-tls" not found Apr 17 20:05:43.833088 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.833053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8eafe28a-1549-4404-a39c-dad771c51b73-ca-trust-extracted\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.833417 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.833397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8eafe28a-1549-4404-a39c-dad771c51b73-ca-trust-extracted\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.833481 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.833396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8eafe28a-1549-4404-a39c-dad771c51b73-registry-certificates\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.833738 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.833716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8eafe28a-1549-4404-a39c-dad771c51b73-trusted-ca\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 
20:05:43.835186 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.835166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8eafe28a-1549-4404-a39c-dad771c51b73-image-registry-private-configuration\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.835294 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.835223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8eafe28a-1549-4404-a39c-dad771c51b73-installation-pull-secrets\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.840875 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.840853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-bound-sa-token\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.841120 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.841101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flmts\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-kube-api-access-flmts\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:43.902997 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:43.902956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dvq7l" 
event={"ID":"3c3a02fe-7f40-416b-90f8-3df7975d906c","Type":"ContainerStarted","Data":"8ab2bc34886554419aef510a20eeed68966ac8a84a449dbcf6179f41a2522281"} Apr 17 20:05:44.338652 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:44.338608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:44.338828 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:44.338766 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:05:44.338828 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:44.338783 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64db8476cb-95w6j: secret "image-registry-tls" not found Apr 17 20:05:44.338944 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:44.338854 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls podName:8eafe28a-1549-4404-a39c-dad771c51b73 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:45.33882035 +0000 UTC m=+84.206640093 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls") pod "image-registry-64db8476cb-95w6j" (UID: "8eafe28a-1549-4404-a39c-dad771c51b73") : secret "image-registry-tls" not found Apr 17 20:05:44.906870 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:44.906822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dvq7l" event={"ID":"3c3a02fe-7f40-416b-90f8-3df7975d906c","Type":"ContainerStarted","Data":"80c38f51bed136415dad90427e24218e98b8f67e37eb6ffac09238e6cbd4aea2"} Apr 17 20:05:44.906870 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:44.906870 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dvq7l" event={"ID":"3c3a02fe-7f40-416b-90f8-3df7975d906c","Type":"ContainerStarted","Data":"e35a08fa16e08cebf4466d8d48349065638d4c5c625a387672de1fb18ea851b8"} Apr 17 20:05:44.921321 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:44.921275 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dvq7l" podStartSLOduration=1.585483365 podStartE2EDuration="2.921260551s" podCreationTimestamp="2026-04-17 20:05:42 +0000 UTC" firstStartedPulling="2026-04-17 20:05:43.10628282 +0000 UTC m=+81.974102559" lastFinishedPulling="2026-04-17 20:05:44.442059992 +0000 UTC m=+83.309879745" observedRunningTime="2026-04-17 20:05:44.920975666 +0000 UTC m=+83.788795426" watchObservedRunningTime="2026-04-17 20:05:44.921260551 +0000 UTC m=+83.789080311" Apr 17 20:05:45.347026 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.346988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls\") pod \"image-registry-64db8476cb-95w6j\" (UID: 
\"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:45.347186 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:45.347138 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:05:45.347186 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:45.347159 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64db8476cb-95w6j: secret "image-registry-tls" not found Apr 17 20:05:45.347263 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:45.347222 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls podName:8eafe28a-1549-4404-a39c-dad771c51b73 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:47.347205415 +0000 UTC m=+86.215025154 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls") pod "image-registry-64db8476cb-95w6j" (UID: "8eafe28a-1549-4404-a39c-dad771c51b73") : secret "image-registry-tls" not found Apr 17 20:05:45.551658 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.551616 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kjsfm"] Apr 17 20:05:45.554895 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.554873 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.557284 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.557260 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 20:05:45.557401 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.557299 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 20:05:45.557401 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.557362 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 20:05:45.558060 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.558036 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zcfv7\"" Apr 17 20:05:45.558153 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.558082 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 20:05:45.562594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.562573 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kjsfm"] Apr 17 20:05:45.649527 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.649450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.649527 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.649481 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s4bcn\" (UniqueName: \"kubernetes.io/projected/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-kube-api-access-s4bcn\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.649702 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.649561 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.649702 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.649606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-data-volume\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.649702 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.649643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-crio-socket\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.750647 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.750608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kjsfm\" (UID: 
\"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.750864 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.750664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-data-volume\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.750864 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.750698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-crio-socket\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.750864 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.750742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.750864 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.750759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4bcn\" (UniqueName: \"kubernetes.io/projected/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-kube-api-access-s4bcn\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.750864 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:45.750786 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret 
"insights-runtime-extractor-tls" not found Apr 17 20:05:45.751095 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.750861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-crio-socket\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.751095 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:45.750887 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls podName:0ad0b493-02ab-40b9-ac9b-42e735a5eb63 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:46.250864725 +0000 UTC m=+85.118684463 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls") pod "insights-runtime-extractor-kjsfm" (UID: "0ad0b493-02ab-40b9-ac9b-42e735a5eb63") : secret "insights-runtime-extractor-tls" not found Apr 17 20:05:45.751170 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.751100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-data-volume\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.751326 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.751309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " 
pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:45.762561 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:45.762541 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4bcn\" (UniqueName: \"kubernetes.io/projected/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-kube-api-access-s4bcn\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:46.254919 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:46.254878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:46.255296 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:46.255033 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 20:05:46.255296 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:46.255100 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls podName:0ad0b493-02ab-40b9-ac9b-42e735a5eb63 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:47.255083972 +0000 UTC m=+86.122903720 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls") pod "insights-runtime-extractor-kjsfm" (UID: "0ad0b493-02ab-40b9-ac9b-42e735a5eb63") : secret "insights-runtime-extractor-tls" not found Apr 17 20:05:47.263019 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:47.262981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:47.263386 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:47.263100 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 20:05:47.263386 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:47.263159 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls podName:0ad0b493-02ab-40b9-ac9b-42e735a5eb63 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:49.263142727 +0000 UTC m=+88.130962468 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls") pod "insights-runtime-extractor-kjsfm" (UID: "0ad0b493-02ab-40b9-ac9b-42e735a5eb63") : secret "insights-runtime-extractor-tls" not found Apr 17 20:05:47.364325 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:47.364288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:47.364480 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:47.364425 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:05:47.364480 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:47.364445 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64db8476cb-95w6j: secret "image-registry-tls" not found Apr 17 20:05:47.364564 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:47.364500 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls podName:8eafe28a-1549-4404-a39c-dad771c51b73 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:51.364483603 +0000 UTC m=+90.232303342 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls") pod "image-registry-64db8476cb-95w6j" (UID: "8eafe28a-1549-4404-a39c-dad771c51b73") : secret "image-registry-tls" not found Apr 17 20:05:49.279969 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:49.279913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:49.280478 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:49.280090 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 20:05:49.280478 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:05:49.280175 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls podName:0ad0b493-02ab-40b9-ac9b-42e735a5eb63 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:53.28015516 +0000 UTC m=+92.147974905 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls") pod "insights-runtime-extractor-kjsfm" (UID: "0ad0b493-02ab-40b9-ac9b-42e735a5eb63") : secret "insights-runtime-extractor-tls" not found Apr 17 20:05:51.396831 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:51.396796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:51.399120 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:51.399098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls\") pod \"image-registry-64db8476cb-95w6j\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:51.403011 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:51.402988 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:51.519433 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:51.519399 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64db8476cb-95w6j"] Apr 17 20:05:51.524022 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:05:51.523988 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eafe28a_1549_4404_a39c_dad771c51b73.slice/crio-d75ec74389c7c95e502fb8ee49d71327177904b422a25fc2c40ac15b1b2b623a WatchSource:0}: Error finding container d75ec74389c7c95e502fb8ee49d71327177904b422a25fc2c40ac15b1b2b623a: Status 404 returned error can't find the container with id d75ec74389c7c95e502fb8ee49d71327177904b422a25fc2c40ac15b1b2b623a Apr 17 20:05:51.925472 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:51.925436 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64db8476cb-95w6j" event={"ID":"8eafe28a-1549-4404-a39c-dad771c51b73","Type":"ContainerStarted","Data":"aceccba09fec67b6a07cee09e662f287b4bfe2bdcbff21e72345c3a085f386ab"} Apr 17 20:05:51.925472 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:51.925478 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64db8476cb-95w6j" event={"ID":"8eafe28a-1549-4404-a39c-dad771c51b73","Type":"ContainerStarted","Data":"d75ec74389c7c95e502fb8ee49d71327177904b422a25fc2c40ac15b1b2b623a"} Apr 17 20:05:51.925716 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:51.925685 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:05:51.944861 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:51.944793 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-64db8476cb-95w6j" 
podStartSLOduration=8.944778553 podStartE2EDuration="8.944778553s" podCreationTimestamp="2026-04-17 20:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:05:51.943891737 +0000 UTC m=+90.811711500" watchObservedRunningTime="2026-04-17 20:05:51.944778553 +0000 UTC m=+90.812598349" Apr 17 20:05:53.312019 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:53.311953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:53.314379 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:53.314358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0ad0b493-02ab-40b9-ac9b-42e735a5eb63-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kjsfm\" (UID: \"0ad0b493-02ab-40b9-ac9b-42e735a5eb63\") " pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:53.364539 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:53.364497 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kjsfm" Apr 17 20:05:53.483450 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:53.483416 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kjsfm"] Apr 17 20:05:53.486769 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:05:53.486739 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ad0b493_02ab_40b9_ac9b_42e735a5eb63.slice/crio-bf648e4a4a50a2e3d021bcc7485c2fd686fde8ad7c8202dbae4de2611453ca24 WatchSource:0}: Error finding container bf648e4a4a50a2e3d021bcc7485c2fd686fde8ad7c8202dbae4de2611453ca24: Status 404 returned error can't find the container with id bf648e4a4a50a2e3d021bcc7485c2fd686fde8ad7c8202dbae4de2611453ca24 Apr 17 20:05:53.932603 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:53.932565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kjsfm" event={"ID":"0ad0b493-02ab-40b9-ac9b-42e735a5eb63","Type":"ContainerStarted","Data":"1250ad0700b54507d7c05170dda95796e555bc56d457089348c010f187d204af"} Apr 17 20:05:53.932603 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:53.932604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kjsfm" event={"ID":"0ad0b493-02ab-40b9-ac9b-42e735a5eb63","Type":"ContainerStarted","Data":"bf648e4a4a50a2e3d021bcc7485c2fd686fde8ad7c8202dbae4de2611453ca24"} Apr 17 20:05:54.936995 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:54.936958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kjsfm" event={"ID":"0ad0b493-02ab-40b9-ac9b-42e735a5eb63","Type":"ContainerStarted","Data":"46810ff8c0faeb2f006fd29aa0e80d1ef6262ca8d631568abfa31871c4d45035"} Apr 17 20:05:55.941170 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:55.941075 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-insights/insights-runtime-extractor-kjsfm" event={"ID":"0ad0b493-02ab-40b9-ac9b-42e735a5eb63","Type":"ContainerStarted","Data":"0ab2badcdc2f685c506acbbd19df2c3c21c8b4c52958c01892ee52184c1ef4fe"} Apr 17 20:05:55.961066 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:55.961011 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kjsfm" podStartSLOduration=8.851701306 podStartE2EDuration="10.960994062s" podCreationTimestamp="2026-04-17 20:05:45 +0000 UTC" firstStartedPulling="2026-04-17 20:05:53.551094127 +0000 UTC m=+92.418913866" lastFinishedPulling="2026-04-17 20:05:55.660386879 +0000 UTC m=+94.528206622" observedRunningTime="2026-04-17 20:05:55.960585482 +0000 UTC m=+94.828405256" watchObservedRunningTime="2026-04-17 20:05:55.960994062 +0000 UTC m=+94.828813823" Apr 17 20:05:58.449466 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:58.449371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " pod="openshift-dns/dns-default-p227m" Apr 17 20:05:58.449800 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:58.449475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg" Apr 17 20:05:58.451709 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:58.451686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8314d958-7474-40f1-b9c0-6ce6e8e69f40-metrics-tls\") pod \"dns-default-p227m\" (UID: \"8314d958-7474-40f1-b9c0-6ce6e8e69f40\") " 
pod="openshift-dns/dns-default-p227m" Apr 17 20:05:58.451811 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:58.451737 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/642f7cfc-05bf-4010-8948-bc21230b60e0-cert\") pod \"ingress-canary-l2wrg\" (UID: \"642f7cfc-05bf-4010-8948-bc21230b60e0\") " pod="openshift-ingress-canary/ingress-canary-l2wrg" Apr 17 20:05:58.486998 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:58.486970 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4cbzc\"" Apr 17 20:05:58.493619 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:58.493600 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tl4j2\"" Apr 17 20:05:58.495712 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:58.495696 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p227m" Apr 17 20:05:58.502430 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:58.502398 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l2wrg" Apr 17 20:05:58.632596 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:58.632561 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p227m"] Apr 17 20:05:58.636309 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:05:58.636283 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8314d958_7474_40f1_b9c0_6ce6e8e69f40.slice/crio-3eea359eab5670c3d9c157eb2925a7b92788a0ff5d3db3a31662cf6c317b1d1a WatchSource:0}: Error finding container 3eea359eab5670c3d9c157eb2925a7b92788a0ff5d3db3a31662cf6c317b1d1a: Status 404 returned error can't find the container with id 3eea359eab5670c3d9c157eb2925a7b92788a0ff5d3db3a31662cf6c317b1d1a Apr 17 20:05:58.646316 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:58.646288 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l2wrg"] Apr 17 20:05:58.650174 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:05:58.650145 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod642f7cfc_05bf_4010_8948_bc21230b60e0.slice/crio-1569d27712a1c9f53fb72f42ad7106b6a44e2b9bba9c63e432be30478f80f5e2 WatchSource:0}: Error finding container 1569d27712a1c9f53fb72f42ad7106b6a44e2b9bba9c63e432be30478f80f5e2: Status 404 returned error can't find the container with id 1569d27712a1c9f53fb72f42ad7106b6a44e2b9bba9c63e432be30478f80f5e2 Apr 17 20:05:58.949227 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:58.949189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p227m" event={"ID":"8314d958-7474-40f1-b9c0-6ce6e8e69f40","Type":"ContainerStarted","Data":"3eea359eab5670c3d9c157eb2925a7b92788a0ff5d3db3a31662cf6c317b1d1a"} Apr 17 20:05:58.950256 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:58.950236 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l2wrg" event={"ID":"642f7cfc-05bf-4010-8948-bc21230b60e0","Type":"ContainerStarted","Data":"1569d27712a1c9f53fb72f42ad7106b6a44e2b9bba9c63e432be30478f80f5e2"} Apr 17 20:05:59.557884 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:59.557829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:59.558389 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:59.557933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:59.558389 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:59.557964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:59.558639 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:59.558612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b938b6-9d16-44cc-9e04-55819bafd06e-service-ca-bundle\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:59.560982 
ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:59.560951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eea67969-35bc-420e-a3e8-819b529619bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fbvmc\" (UID: \"eea67969-35bc-420e-a3e8-819b529619bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:59.561111 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:59.561039 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69b938b6-9d16-44cc-9e04-55819bafd06e-metrics-certs\") pod \"router-default-5f85496cb-r87zd\" (UID: \"69b938b6-9d16-44cc-9e04-55819bafd06e\") " pod="openshift-ingress/router-default-5f85496cb-r87zd" Apr 17 20:05:59.850398 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:59.850305 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" Apr 17 20:05:59.854201 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:05:59.854174 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5f85496cb-r87zd"
Apr 17 20:06:00.682790 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:00.682703 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc"]
Apr 17 20:06:00.688466 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:06:00.688433 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeea67969_35bc_420e_a3e8_819b529619bf.slice/crio-06174b46f8ec2fbb341efe05b77dd4f3b36854a04aea21e32ab3926f3fade8a3 WatchSource:0}: Error finding container 06174b46f8ec2fbb341efe05b77dd4f3b36854a04aea21e32ab3926f3fade8a3: Status 404 returned error can't find the container with id 06174b46f8ec2fbb341efe05b77dd4f3b36854a04aea21e32ab3926f3fade8a3
Apr 17 20:06:00.712882 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:00.710803 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5f85496cb-r87zd"]
Apr 17 20:06:00.717354 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:06:00.717327 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69b938b6_9d16_44cc_9e04_55819bafd06e.slice/crio-234638bfc7d2c5d233e762f71bbf8966d908ea02fbfdd8aa7fe393e88b551a19 WatchSource:0}: Error finding container 234638bfc7d2c5d233e762f71bbf8966d908ea02fbfdd8aa7fe393e88b551a19: Status 404 returned error can't find the container with id 234638bfc7d2c5d233e762f71bbf8966d908ea02fbfdd8aa7fe393e88b551a19
Apr 17 20:06:00.956140 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:00.956103 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5f85496cb-r87zd" event={"ID":"69b938b6-9d16-44cc-9e04-55819bafd06e","Type":"ContainerStarted","Data":"0d9fc02d569d7defa1ab8983f41423ac38c1474975eaca4e51ce57598031b685"}
Apr 17 20:06:00.956325 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:00.956146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5f85496cb-r87zd" event={"ID":"69b938b6-9d16-44cc-9e04-55819bafd06e","Type":"ContainerStarted","Data":"234638bfc7d2c5d233e762f71bbf8966d908ea02fbfdd8aa7fe393e88b551a19"}
Apr 17 20:06:00.957454 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:00.957431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l2wrg" event={"ID":"642f7cfc-05bf-4010-8948-bc21230b60e0","Type":"ContainerStarted","Data":"62069aa8102d69a050a3c29f865f11d4a2239ea2e6ec4fc843a3c659e96e183c"}
Apr 17 20:06:00.958899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:00.958864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p227m" event={"ID":"8314d958-7474-40f1-b9c0-6ce6e8e69f40","Type":"ContainerStarted","Data":"161294d50ccc50ea4e91b2a11a389e29ac151ef78beaa5f615c4ab58fc5ea7c4"}
Apr 17 20:06:00.958899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:00.958898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p227m" event={"ID":"8314d958-7474-40f1-b9c0-6ce6e8e69f40","Type":"ContainerStarted","Data":"54826b0e15edc47fcc75779ebcaf7384a19d591de184738137f4635db24c45d4"}
Apr 17 20:06:00.959074 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:00.958985 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-p227m"
Apr 17 20:06:00.959903 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:00.959871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" event={"ID":"eea67969-35bc-420e-a3e8-819b529619bf","Type":"ContainerStarted","Data":"06174b46f8ec2fbb341efe05b77dd4f3b36854a04aea21e32ab3926f3fade8a3"}
Apr 17 20:06:00.973986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:00.973948 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5f85496cb-r87zd" podStartSLOduration=33.973937206 podStartE2EDuration="33.973937206s" podCreationTimestamp="2026-04-17 20:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:06:00.973077956 +0000 UTC m=+99.840897716" watchObservedRunningTime="2026-04-17 20:06:00.973937206 +0000 UTC m=+99.841756966"
Apr 17 20:06:00.995506 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:00.995453 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p227m" podStartSLOduration=65.084348983 podStartE2EDuration="1m6.995437567s" podCreationTimestamp="2026-04-17 20:04:54 +0000 UTC" firstStartedPulling="2026-04-17 20:05:58.638257762 +0000 UTC m=+97.506077505" lastFinishedPulling="2026-04-17 20:06:00.549346336 +0000 UTC m=+99.417166089" observedRunningTime="2026-04-17 20:06:00.994993852 +0000 UTC m=+99.862813616" watchObservedRunningTime="2026-04-17 20:06:00.995437567 +0000 UTC m=+99.863257327"
Apr 17 20:06:01.009244 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:01.009199 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-l2wrg" podStartSLOduration=65.106403299 podStartE2EDuration="1m7.009184189s" podCreationTimestamp="2026-04-17 20:04:54 +0000 UTC" firstStartedPulling="2026-04-17 20:05:58.65227093 +0000 UTC m=+97.520090670" lastFinishedPulling="2026-04-17 20:06:00.555051806 +0000 UTC m=+99.422871560" observedRunningTime="2026-04-17 20:06:01.008215498 +0000 UTC m=+99.876035259" watchObservedRunningTime="2026-04-17 20:06:01.009184189 +0000 UTC m=+99.877003981"
Apr 17 20:06:01.854856 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:01.854812 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5f85496cb-r87zd"
Apr 17 20:06:01.857542 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:01.857520 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5f85496cb-r87zd"
Apr 17 20:06:01.965031 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:01.964995 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5f85496cb-r87zd"
Apr 17 20:06:01.966112 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:01.966092 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5f85496cb-r87zd"
Apr 17 20:06:03.806248 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.806221 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64db8476cb-95w6j"]
Apr 17 20:06:03.916587 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.916554 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg"]
Apr 17 20:06:03.919607 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.919586 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-l7bms"]
Apr 17 20:06:03.919750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.919729 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg"
Apr 17 20:06:03.922451 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.922425 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 20:06:03.922451 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.922443 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-l7bms"
Apr 17 20:06:03.922628 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.922434 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-sjkc6\""
Apr 17 20:06:03.924200 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.924185 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 20:06:03.924352 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.924333 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-br987\""
Apr 17 20:06:03.924501 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.924488 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 20:06:03.930610 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.930588 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg"]
Apr 17 20:06:03.931800 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.931777 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-l7bms"]
Apr 17 20:06:03.971176 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.971137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" event={"ID":"eea67969-35bc-420e-a3e8-819b529619bf","Type":"ContainerStarted","Data":"3ab6cc6040e854469da683c090585e102255ff7c5292935858a858feba1ab692"}
Apr 17 20:06:03.989759 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:03.989717 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fbvmc" podStartSLOduration=34.602791283 podStartE2EDuration="36.989703195s" podCreationTimestamp="2026-04-17 20:05:27 +0000 UTC" firstStartedPulling="2026-04-17 20:06:00.690454548 +0000 UTC m=+99.558274290" lastFinishedPulling="2026-04-17 20:06:03.077366448 +0000 UTC m=+101.945186202" observedRunningTime="2026-04-17 20:06:03.988796437 +0000 UTC m=+102.856616210" watchObservedRunningTime="2026-04-17 20:06:03.989703195 +0000 UTC m=+102.857522956"
Apr 17 20:06:04.094650 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:04.094614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/723a02cc-35ab-40aa-88d0-8ef966947773-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-dzdlg\" (UID: \"723a02cc-35ab-40aa-88d0-8ef966947773\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg"
Apr 17 20:06:04.094828 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:04.094760 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f5gp\" (UniqueName: \"kubernetes.io/projected/a29542e8-2ce0-4458-a973-dafcdaa4d486-kube-api-access-9f5gp\") pod \"downloads-6bcc868b7-l7bms\" (UID: \"a29542e8-2ce0-4458-a973-dafcdaa4d486\") " pod="openshift-console/downloads-6bcc868b7-l7bms"
Apr 17 20:06:04.195813 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:04.195770 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/723a02cc-35ab-40aa-88d0-8ef966947773-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-dzdlg\" (UID: \"723a02cc-35ab-40aa-88d0-8ef966947773\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg"
Apr 17 20:06:04.196024 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:04.195830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9f5gp\" (UniqueName: \"kubernetes.io/projected/a29542e8-2ce0-4458-a973-dafcdaa4d486-kube-api-access-9f5gp\") pod \"downloads-6bcc868b7-l7bms\" (UID: \"a29542e8-2ce0-4458-a973-dafcdaa4d486\") " pod="openshift-console/downloads-6bcc868b7-l7bms"
Apr 17 20:06:04.198221 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:04.198186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/723a02cc-35ab-40aa-88d0-8ef966947773-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-dzdlg\" (UID: \"723a02cc-35ab-40aa-88d0-8ef966947773\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg"
Apr 17 20:06:04.203384 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:04.203360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f5gp\" (UniqueName: \"kubernetes.io/projected/a29542e8-2ce0-4458-a973-dafcdaa4d486-kube-api-access-9f5gp\") pod \"downloads-6bcc868b7-l7bms\" (UID: \"a29542e8-2ce0-4458-a973-dafcdaa4d486\") " pod="openshift-console/downloads-6bcc868b7-l7bms"
Apr 17 20:06:04.230282 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:04.230239 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg"
Apr 17 20:06:04.237957 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:04.237931 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-l7bms"
Apr 17 20:06:04.368195 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:04.368095 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg"]
Apr 17 20:06:04.371899 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:06:04.371870 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod723a02cc_35ab_40aa_88d0_8ef966947773.slice/crio-1fadedcc599d7b721984155ef5a883b3df4e3e9de3c5771cd62dbcfbc1a03e09 WatchSource:0}: Error finding container 1fadedcc599d7b721984155ef5a883b3df4e3e9de3c5771cd62dbcfbc1a03e09: Status 404 returned error can't find the container with id 1fadedcc599d7b721984155ef5a883b3df4e3e9de3c5771cd62dbcfbc1a03e09
Apr 17 20:06:04.389506 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:04.389477 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-l7bms"]
Apr 17 20:06:04.392856 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:06:04.392807 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda29542e8_2ce0_4458_a973_dafcdaa4d486.slice/crio-31eaf7c453f212fe4ec422da681a05e7b1472478acdef60675ff5e4b155d879d WatchSource:0}: Error finding container 31eaf7c453f212fe4ec422da681a05e7b1472478acdef60675ff5e4b155d879d: Status 404 returned error can't find the container with id 31eaf7c453f212fe4ec422da681a05e7b1472478acdef60675ff5e4b155d879d
Apr 17 20:06:04.975567 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:04.975516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-l7bms" event={"ID":"a29542e8-2ce0-4458-a973-dafcdaa4d486","Type":"ContainerStarted","Data":"31eaf7c453f212fe4ec422da681a05e7b1472478acdef60675ff5e4b155d879d"}
Apr 17 20:06:04.976967 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:04.976879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg" event={"ID":"723a02cc-35ab-40aa-88d0-8ef966947773","Type":"ContainerStarted","Data":"1fadedcc599d7b721984155ef5a883b3df4e3e9de3c5771cd62dbcfbc1a03e09"}
Apr 17 20:06:05.985630 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:05.985544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg" event={"ID":"723a02cc-35ab-40aa-88d0-8ef966947773","Type":"ContainerStarted","Data":"b9a099429b2d65aefb38903ac2ca6b65c6b4a0d45a21ce0143180fc0972c0844"}
Apr 17 20:06:05.986070 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:05.985736 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg"
Apr 17 20:06:05.991650 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:05.991623 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg"
Apr 17 20:06:05.999719 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:05.999660 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dzdlg" podStartSLOduration=1.735948593 podStartE2EDuration="2.999641891s" podCreationTimestamp="2026-04-17 20:06:03 +0000 UTC" firstStartedPulling="2026-04-17 20:06:04.373872558 +0000 UTC m=+103.241692301" lastFinishedPulling="2026-04-17 20:06:05.637565858 +0000 UTC m=+104.505385599" observedRunningTime="2026-04-17 20:06:05.998725615 +0000 UTC m=+104.866545378" watchObservedRunningTime="2026-04-17 20:06:05.999641891 +0000 UTC m=+104.867461653"
Apr 17 20:06:06.628494 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.628457 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"]
Apr 17 20:06:06.631962 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.631939 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:06.634156 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.634120 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 20:06:06.634280 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.634158 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 20:06:06.634280 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.634160 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-6mmpt\""
Apr 17 20:06:06.635226 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.635049 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 20:06:06.638333 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.638309 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"]
Apr 17 20:06:06.721138 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.721105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:06.721303 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.721156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhgc7\" (UniqueName: \"kubernetes.io/projected/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-kube-api-access-rhgc7\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:06.721303 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.721281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:06.721408 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.721333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:06.822465 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.822428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:06.822465 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.822478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:06.822694 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.822514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:06.822694 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.822546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhgc7\" (UniqueName: \"kubernetes.io/projected/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-kube-api-access-rhgc7\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:06.822694 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:06:06.822638 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 17 20:06:06.822899 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:06:06.822718 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-prometheus-operator-tls podName:b80e8a19-d2f5-44be-995b-32f6fa5ba15c nodeName:}" failed. No retries permitted until 2026-04-17 20:06:07.322698014 +0000 UTC m=+106.190517765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-x8nkg" (UID: "b80e8a19-d2f5-44be-995b-32f6fa5ba15c") : secret "prometheus-operator-tls" not found
Apr 17 20:06:06.823291 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.823257 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:06.825018 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.824989 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:06.833468 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:06.833444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhgc7\" (UniqueName: \"kubernetes.io/projected/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-kube-api-access-rhgc7\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:07.328212 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:07.328176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:07.330715 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:07.330681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b80e8a19-d2f5-44be-995b-32f6fa5ba15c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-x8nkg\" (UID: \"b80e8a19-d2f5-44be-995b-32f6fa5ba15c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:07.542646 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:07.542597 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"
Apr 17 20:06:07.675211 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:07.675176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-x8nkg"]
Apr 17 20:06:07.679636 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:06:07.679595 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb80e8a19_d2f5_44be_995b_32f6fa5ba15c.slice/crio-fade0656c20dd3f190991db1f950a8037a3bd53a208d777599739dfa189ff0f8 WatchSource:0}: Error finding container fade0656c20dd3f190991db1f950a8037a3bd53a208d777599739dfa189ff0f8: Status 404 returned error can't find the container with id fade0656c20dd3f190991db1f950a8037a3bd53a208d777599739dfa189ff0f8
Apr 17 20:06:07.992269 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:07.992178 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg" event={"ID":"b80e8a19-d2f5-44be-995b-32f6fa5ba15c","Type":"ContainerStarted","Data":"fade0656c20dd3f190991db1f950a8037a3bd53a208d777599739dfa189ff0f8"}
Apr 17 20:06:09.999363 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:09.999326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg" event={"ID":"b80e8a19-d2f5-44be-995b-32f6fa5ba15c","Type":"ContainerStarted","Data":"d2d67bdbe44a067c4ed15ce4402c7eb5fbed27651f985f89b13c527debdafdca"}
Apr 17 20:06:09.999759 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:09.999371 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg" event={"ID":"b80e8a19-d2f5-44be-995b-32f6fa5ba15c","Type":"ContainerStarted","Data":"6d6553fc0561fa1e2c96f0a12939a6335fb8e10994e71fa27968c8b804bf94a4"}
Apr 17 20:06:10.015038 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:10.014972 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-x8nkg" podStartSLOduration=2.717559767 podStartE2EDuration="4.014951185s" podCreationTimestamp="2026-04-17 20:06:06 +0000 UTC" firstStartedPulling="2026-04-17 20:06:07.681931839 +0000 UTC m=+106.549751579" lastFinishedPulling="2026-04-17 20:06:08.979323258 +0000 UTC m=+107.847142997" observedRunningTime="2026-04-17 20:06:10.014496009 +0000 UTC m=+108.882315797" watchObservedRunningTime="2026-04-17 20:06:10.014951185 +0000 UTC m=+108.882770947"
Apr 17 20:06:10.967617 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:10.967586 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p227m"
Apr 17 20:06:12.010576 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.010539 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-t2lrx"]
Apr 17 20:06:12.014181 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.014158 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.016312 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.016291 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 20:06:12.016456 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.016343 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 20:06:12.016533 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.016471 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-ntvkb\""
Apr 17 20:06:12.016617 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.016601 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 20:06:12.168620 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.168578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63581dd1-b691-415d-8ba2-5d1f37bbf88b-sys\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.168620 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.168620 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/63581dd1-b691-415d-8ba2-5d1f37bbf88b-root\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.168908 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.168692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.168908 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.168756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-textfile\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.168908 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.168823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-accelerators-collector-config\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.168908 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.168883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-tls\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.169086 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.168916 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldtsx\" (UniqueName: \"kubernetes.io/projected/63581dd1-b691-415d-8ba2-5d1f37bbf88b-kube-api-access-ldtsx\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.169086 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.168947 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63581dd1-b691-415d-8ba2-5d1f37bbf88b-metrics-client-ca\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.169086 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.168964 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-wtmp\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.269766 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.269677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-accelerators-collector-config\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.269766 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.269723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-tls\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.269766 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.269744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldtsx\" (UniqueName: \"kubernetes.io/projected/63581dd1-b691-415d-8ba2-5d1f37bbf88b-kube-api-access-ldtsx\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.269766 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.269768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63581dd1-b691-415d-8ba2-5d1f37bbf88b-metrics-client-ca\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.270103 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.269790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-wtmp\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.270103 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.269834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63581dd1-b691-415d-8ba2-5d1f37bbf88b-sys\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.270103 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.269871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/63581dd1-b691-415d-8ba2-5d1f37bbf88b-root\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.270103 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.269899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.270103 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.269923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-textfile\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.270103 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.269986 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-wtmp\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.270103 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.270043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63581dd1-b691-415d-8ba2-5d1f37bbf88b-sys\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.270103 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.270042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/63581dd1-b691-415d-8ba2-5d1f37bbf88b-root\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.270638 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.270410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-accelerators-collector-config\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.270638 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.270582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-textfile\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.270855 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.270815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63581dd1-b691-415d-8ba2-5d1f37bbf88b-metrics-client-ca\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.272521 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.272497 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-tls\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.272644 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.272616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63581dd1-b691-415d-8ba2-5d1f37bbf88b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx"
Apr 17 20:06:12.277678 ip-10-0-131-172 kubenswrapper[2576]: I0417
20:06:12.277620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldtsx\" (UniqueName: \"kubernetes.io/projected/63581dd1-b691-415d-8ba2-5d1f37bbf88b-kube-api-access-ldtsx\") pod \"node-exporter-t2lrx\" (UID: \"63581dd1-b691-415d-8ba2-5d1f37bbf88b\") " pod="openshift-monitoring/node-exporter-t2lrx" Apr 17 20:06:12.324525 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:12.324486 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-t2lrx" Apr 17 20:06:12.332436 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:06:12.332398 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63581dd1_b691_415d_8ba2_5d1f37bbf88b.slice/crio-e7918841e6cbc4f67969cbf4a1896320a2b8e39394c4cc8fd470a04a01aeae8e WatchSource:0}: Error finding container e7918841e6cbc4f67969cbf4a1896320a2b8e39394c4cc8fd470a04a01aeae8e: Status 404 returned error can't find the container with id e7918841e6cbc4f67969cbf4a1896320a2b8e39394c4cc8fd470a04a01aeae8e Apr 17 20:06:13.010086 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.010047 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-t2lrx" event={"ID":"63581dd1-b691-415d-8ba2-5d1f37bbf88b","Type":"ContainerStarted","Data":"e7918841e6cbc4f67969cbf4a1896320a2b8e39394c4cc8fd470a04a01aeae8e"} Apr 17 20:06:13.060053 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.060021 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:06:13.067627 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.066470 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.069381 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.069058 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 20:06:13.069381 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.069101 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 20:06:13.069381 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.069129 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fdt85\"" Apr 17 20:06:13.069381 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.069134 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 20:06:13.069381 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.069108 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 20:06:13.069381 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.069307 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 20:06:13.069798 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.069569 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 20:06:13.069798 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.069671 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 20:06:13.069798 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.069696 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 20:06:13.069798 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.069676 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 20:06:13.075985 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.075967 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:06:13.176911 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.176865 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.177048 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.177007 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.177109 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.177073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-config-out\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.177169 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.177110 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.177222 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.177189 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.177327 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.177305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.177396 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.177350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-web-config\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.177444 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.177391 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 17 20:06:13.177444 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.177428 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.177559 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.177454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b227\" (UniqueName: \"kubernetes.io/projected/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-kube-api-access-6b227\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.177719 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.177690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.177802 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.177756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-config-volume\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.177876 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.177806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.278927 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.278547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.278927 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.278592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.278927 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.278654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.278927 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.278681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-config-out\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.278927 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.278705 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.278927 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.278736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.278927 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.278767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.280933 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.279994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-web-config\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.280933 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.280086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.280933 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.280131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.280933 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.280159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6b227\" (UniqueName: \"kubernetes.io/projected/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-kube-api-access-6b227\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.280933 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.280189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.280933 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.280224 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-config-volume\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.280933 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.280359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.281341 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:06:13.281060 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-trusted-ca-bundle podName:0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d nodeName:}" failed. No retries permitted until 2026-04-17 20:06:13.781033633 +0000 UTC m=+112.648853378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d") : configmap references non-existent config key: ca-bundle.crt Apr 17 20:06:13.282961 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.281999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.286661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.286266 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-config-volume\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.286661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.286619 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.289501 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.289076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.289501 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.289079 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-config-out\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.289501 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.289327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.289501 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.289446 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.289819 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.289638 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.289819 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.289714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.291938 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.291904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b227\" (UniqueName: \"kubernetes.io/projected/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-kube-api-access-6b227\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.292292 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.292255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-web-config\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.784774 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.784735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.785498 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.785469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:13.811268 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.811237 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:06:13.980307 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:13.980261 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:06:14.014985 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:14.014948 2576 generic.go:358] "Generic (PLEG): container finished" podID="63581dd1-b691-415d-8ba2-5d1f37bbf88b" containerID="ea9ae49721728ff2553e2c2a51fbf305ab4409951104e8e52fb23a35d056568e" exitCode=0 Apr 17 20:06:14.015129 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:14.015030 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-t2lrx" event={"ID":"63581dd1-b691-415d-8ba2-5d1f37bbf88b","Type":"ContainerDied","Data":"ea9ae49721728ff2553e2c2a51fbf305ab4409951104e8e52fb23a35d056568e"} Apr 17 20:06:14.118555 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:14.118527 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:06:14.121633 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:06:14.121608 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b1cc99a_9c75_4a2c_bacd_46e8d69cd42d.slice/crio-98ce3edd0f907073919cea4809bce37c9f140ccad73aa54edf7490e08e051044 WatchSource:0}: Error finding container 98ce3edd0f907073919cea4809bce37c9f140ccad73aa54edf7490e08e051044: Status 404 returned error can't find the container with id 
98ce3edd0f907073919cea4809bce37c9f140ccad73aa54edf7490e08e051044 Apr 17 20:06:15.019508 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:15.019469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerStarted","Data":"98ce3edd0f907073919cea4809bce37c9f140ccad73aa54edf7490e08e051044"} Apr 17 20:06:15.021431 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:15.021397 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-t2lrx" event={"ID":"63581dd1-b691-415d-8ba2-5d1f37bbf88b","Type":"ContainerStarted","Data":"c13e89e016e1375a149111c0536581fa634fea375985a2998c17df3502b5357c"} Apr 17 20:06:15.021574 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:15.021436 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-t2lrx" event={"ID":"63581dd1-b691-415d-8ba2-5d1f37bbf88b","Type":"ContainerStarted","Data":"d5f11aeb69dbfff74c5da987fc813544e4d2cdf9a166914b8b36e416cef6bb20"} Apr 17 20:06:15.038544 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:15.038488 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-t2lrx" podStartSLOduration=3.236978422 podStartE2EDuration="4.038472586s" podCreationTimestamp="2026-04-17 20:06:11 +0000 UTC" firstStartedPulling="2026-04-17 20:06:12.334040846 +0000 UTC m=+111.201860585" lastFinishedPulling="2026-04-17 20:06:13.135534997 +0000 UTC m=+112.003354749" observedRunningTime="2026-04-17 20:06:15.037728023 +0000 UTC m=+113.905547797" watchObservedRunningTime="2026-04-17 20:06:15.038472586 +0000 UTC m=+113.906292350" Apr 17 20:06:16.492801 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.492769 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"] Apr 17 20:06:16.496254 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.496225 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.500592 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.500567 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 20:06:16.500770 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.500568 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4dcv29kq06tnj\""
Apr 17 20:06:16.500770 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.500568 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-pl9mk\""
Apr 17 20:06:16.500770 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.500574 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 20:06:16.501036 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.500720 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 20:06:16.501036 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.500867 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 20:06:16.506590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.506568 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06270147-16c3-4db2-a6dd-58512c0c72cc-client-ca-bundle\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.506705 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.506605 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/06270147-16c3-4db2-a6dd-58512c0c72cc-secret-metrics-server-client-certs\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.506705 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.506631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/06270147-16c3-4db2-a6dd-58512c0c72cc-secret-metrics-server-tls\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.506705 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.506663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/06270147-16c3-4db2-a6dd-58512c0c72cc-audit-log\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.506705 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.506689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d82ch\" (UniqueName: \"kubernetes.io/projected/06270147-16c3-4db2-a6dd-58512c0c72cc-kube-api-access-d82ch\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.506951 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.506741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/06270147-16c3-4db2-a6dd-58512c0c72cc-metrics-server-audit-profiles\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.506951 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.506903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06270147-16c3-4db2-a6dd-58512c0c72cc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.507127 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.507106 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"]
Apr 17 20:06:16.607361 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.607323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06270147-16c3-4db2-a6dd-58512c0c72cc-client-ca-bundle\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.607547 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.607393 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/06270147-16c3-4db2-a6dd-58512c0c72cc-secret-metrics-server-client-certs\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.607547 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.607433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/06270147-16c3-4db2-a6dd-58512c0c72cc-secret-metrics-server-tls\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.607547 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.607456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/06270147-16c3-4db2-a6dd-58512c0c72cc-audit-log\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.607718 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.607627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d82ch\" (UniqueName: \"kubernetes.io/projected/06270147-16c3-4db2-a6dd-58512c0c72cc-kube-api-access-d82ch\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.607718 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.607705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/06270147-16c3-4db2-a6dd-58512c0c72cc-metrics-server-audit-profiles\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.607821 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.607788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06270147-16c3-4db2-a6dd-58512c0c72cc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.607908 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.607825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/06270147-16c3-4db2-a6dd-58512c0c72cc-audit-log\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.608517 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.608490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06270147-16c3-4db2-a6dd-58512c0c72cc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.608946 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.608924 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/06270147-16c3-4db2-a6dd-58512c0c72cc-metrics-server-audit-profiles\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.610450 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.610400 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06270147-16c3-4db2-a6dd-58512c0c72cc-client-ca-bundle\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.610556 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.610477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/06270147-16c3-4db2-a6dd-58512c0c72cc-secret-metrics-server-client-certs\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.610621 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.610572 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/06270147-16c3-4db2-a6dd-58512c0c72cc-secret-metrics-server-tls\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.615757 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.615715 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d82ch\" (UniqueName: \"kubernetes.io/projected/06270147-16c3-4db2-a6dd-58512c0c72cc-kube-api-access-d82ch\") pod \"metrics-server-6f8f6f89d6-p2bnw\" (UID: \"06270147-16c3-4db2-a6dd-58512c0c72cc\") " pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.753947 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.753867 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2"]
Apr 17 20:06:16.761313 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.761282 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2"
Apr 17 20:06:16.762942 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.762887 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2"]
Apr 17 20:06:16.763788 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.763589 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 17 20:06:16.763788 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.763676 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-28rcc\""
Apr 17 20:06:16.808733 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.808701 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"
Apr 17 20:06:16.809885 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.809585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/64493f01-c3e5-44b2-879a-259426ce4222-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jnvf2\" (UID: \"64493f01-c3e5-44b2-879a-259426ce4222\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2"
Apr 17 20:06:16.910804 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.910768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/64493f01-c3e5-44b2-879a-259426ce4222-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jnvf2\" (UID: \"64493f01-c3e5-44b2-879a-259426ce4222\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2"
Apr 17 20:06:16.913536 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:16.913504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/64493f01-c3e5-44b2-879a-259426ce4222-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jnvf2\" (UID: \"64493f01-c3e5-44b2-879a-259426ce4222\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2"
Apr 17 20:06:17.078433 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:17.078394 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2"
Apr 17 20:06:18.186190 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.186150 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 20:06:18.191457 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.191430 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.193994 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.193970 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jmxvp\""
Apr 17 20:06:18.194356 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.193996 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 20:06:18.194460 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.194230 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 20:06:18.194521 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.194284 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 20:06:18.194574 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.194313 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 20:06:18.194815 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.194794 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 20:06:18.194934 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.194873 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3bob03gk936tc\""
Apr 17 20:06:18.194992 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.194943 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 20:06:18.195097 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.195078 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 20:06:18.195160 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.195094 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 20:06:18.195213 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.195188 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 20:06:18.195330 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.195312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 20:06:18.195392 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.195344 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 20:06:18.198104 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.197379 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 20:06:18.202954 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.202904 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.223817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.223885 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.223912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.223940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.223966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz8p9\" (UniqueName: \"kubernetes.io/projected/e5d48dcb-f9b5-4299-89f4-4829a18daddd-kube-api-access-cz8p9\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.223990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.224017 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5d48dcb-f9b5-4299-89f4-4829a18daddd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.224042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.224074 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.224109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.224140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.224162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5d48dcb-f9b5-4299-89f4-4829a18daddd-config-out\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.224183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.224224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-config\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.224262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.224563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.224323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.225372 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.224345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.225372 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.224370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-web-config\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.327449 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.327635 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.327635 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327538 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.327635 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-web-config\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.327635 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327611 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.327825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327636 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.327825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.327825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.327825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz8p9\" (UniqueName: \"kubernetes.io/projected/e5d48dcb-f9b5-4299-89f4-4829a18daddd-kube-api-access-cz8p9\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.327825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.327825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327780 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5d48dcb-f9b5-4299-89f4-4829a18daddd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.327825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.328166 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.328166 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.327969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.328166 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.328008 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.328166 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.328037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5d48dcb-f9b5-4299-89f4-4829a18daddd-config-out\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.328166 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.328063 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.328166 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.328098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-config\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.328444 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.328361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.328510 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.328457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.329087 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.328833 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.329312 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.329280 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.330464 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.330442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.331881 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.331861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.332504 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.332212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-config\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.333026 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.333001 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5d48dcb-f9b5-4299-89f4-4829a18daddd-config-out\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.333200 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.333179 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.333474 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.333452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.333756 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.333705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.333961 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.333831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.333961 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.333901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5d48dcb-f9b5-4299-89f4-4829a18daddd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.334232 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.334210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-web-config\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.334582 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.334563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.335042 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.335018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.336132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.336108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.337114 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.337093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz8p9\" (UniqueName: \"kubernetes.io/projected/e5d48dcb-f9b5-4299-89f4-4829a18daddd-kube-api-access-cz8p9\") pod \"prometheus-k8s-0\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:18.506530 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:18.506431 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:06:22.501466 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:22.501434 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 20:06:22.513823 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:06:22.513782 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5d48dcb_f9b5_4299_89f4_4829a18daddd.slice/crio-5b97c375857821b4a7534e43b3f0779d4be668541766d7bf75dbc1fb88916734 WatchSource:0}: Error finding container 5b97c375857821b4a7534e43b3f0779d4be668541766d7bf75dbc1fb88916734: Status 404 returned error can't find the container with id 5b97c375857821b4a7534e43b3f0779d4be668541766d7bf75dbc1fb88916734
Apr 17 20:06:22.716305 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:22.716242 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2"]
Apr 17 20:06:22.719205 ip-10-0-131-172
kubenswrapper[2576]: I0417 20:06:22.719179 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw"] Apr 17 20:06:22.719704 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:06:22.719669 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64493f01_c3e5_44b2_879a_259426ce4222.slice/crio-93ef29327789a97943e9373a628c4c99316a195b5fe6849579128dbd7e6594bc WatchSource:0}: Error finding container 93ef29327789a97943e9373a628c4c99316a195b5fe6849579128dbd7e6594bc: Status 404 returned error can't find the container with id 93ef29327789a97943e9373a628c4c99316a195b5fe6849579128dbd7e6594bc Apr 17 20:06:22.723941 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:06:22.723904 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06270147_16c3_4db2_a6dd_58512c0c72cc.slice/crio-ddfa85ba750dbdae66bb252f6c002de53378eafae1d9fe1b70bf343ceeb29269 WatchSource:0}: Error finding container ddfa85ba750dbdae66bb252f6c002de53378eafae1d9fe1b70bf343ceeb29269: Status 404 returned error can't find the container with id ddfa85ba750dbdae66bb252f6c002de53378eafae1d9fe1b70bf343ceeb29269 Apr 17 20:06:23.047256 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:23.047146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2" event={"ID":"64493f01-c3e5-44b2-879a-259426ce4222","Type":"ContainerStarted","Data":"93ef29327789a97943e9373a628c4c99316a195b5fe6849579128dbd7e6594bc"} Apr 17 20:06:23.049159 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:23.049118 2576 generic.go:358] "Generic (PLEG): container finished" podID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerID="dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe" exitCode=0 Apr 17 20:06:23.049550 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:23.049492 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerDied","Data":"dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe"} Apr 17 20:06:23.050910 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:23.050883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw" event={"ID":"06270147-16c3-4db2-a6dd-58512c0c72cc","Type":"ContainerStarted","Data":"ddfa85ba750dbdae66bb252f6c002de53378eafae1d9fe1b70bf343ceeb29269"} Apr 17 20:06:23.052647 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:23.052620 2576 generic.go:358] "Generic (PLEG): container finished" podID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerID="f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16" exitCode=0 Apr 17 20:06:23.052764 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:23.052730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerDied","Data":"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16"} Apr 17 20:06:23.052764 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:23.052757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerStarted","Data":"5b97c375857821b4a7534e43b3f0779d4be668541766d7bf75dbc1fb88916734"} Apr 17 20:06:23.054443 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:23.054339 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-l7bms" event={"ID":"a29542e8-2ce0-4458-a973-dafcdaa4d486","Type":"ContainerStarted","Data":"7c4f116f1b87f9d3ddcbb15decb8eba78a5823b1b969272c2db89aebc238c5b2"} Apr 17 20:06:23.054601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:23.054580 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-l7bms" Apr 17 20:06:23.065402 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:23.065372 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-l7bms" Apr 17 20:06:23.149725 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:23.149672 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-l7bms" podStartSLOduration=2.165572459 podStartE2EDuration="20.149654516s" podCreationTimestamp="2026-04-17 20:06:03 +0000 UTC" firstStartedPulling="2026-04-17 20:06:04.394868073 +0000 UTC m=+103.262687813" lastFinishedPulling="2026-04-17 20:06:22.378950117 +0000 UTC m=+121.246769870" observedRunningTime="2026-04-17 20:06:23.149298138 +0000 UTC m=+122.017117900" watchObservedRunningTime="2026-04-17 20:06:23.149654516 +0000 UTC m=+122.017474280" Apr 17 20:06:28.081916 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:28.081733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerStarted","Data":"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3"} Apr 17 20:06:28.084735 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:28.083385 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2" event={"ID":"64493f01-c3e5-44b2-879a-259426ce4222","Type":"ContainerStarted","Data":"e5b47f72f4ead01976d8d2da0c5d30ae3de3bcacca5b9972b99cf7df0e07ae34"} Apr 17 20:06:28.084735 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:28.084554 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2" Apr 17 20:06:28.088265 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:28.088241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerStarted","Data":"88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c"} Apr 17 20:06:28.088369 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:28.088274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerStarted","Data":"4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52"} Apr 17 20:06:28.089563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:28.089538 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw" event={"ID":"06270147-16c3-4db2-a6dd-58512c0c72cc","Type":"ContainerStarted","Data":"9b504a0c79a2b37a1bab160d80363ed51d7e44d52b724ccc7e629e47b9af1df9"} Apr 17 20:06:28.094902 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:28.094776 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2" Apr 17 20:06:28.100588 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:28.099772 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jnvf2" podStartSLOduration=7.269343311 podStartE2EDuration="12.099756102s" podCreationTimestamp="2026-04-17 20:06:16 +0000 UTC" firstStartedPulling="2026-04-17 20:06:22.722006776 +0000 UTC m=+121.589826523" lastFinishedPulling="2026-04-17 20:06:27.552419563 +0000 UTC m=+126.420239314" observedRunningTime="2026-04-17 20:06:28.098021717 +0000 UTC m=+126.965841480" watchObservedRunningTime="2026-04-17 20:06:28.099756102 +0000 UTC m=+126.967575864" Apr 17 20:06:28.121934 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:28.121817 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw" 
podStartSLOduration=7.2941722460000005 podStartE2EDuration="12.121800351s" podCreationTimestamp="2026-04-17 20:06:16 +0000 UTC" firstStartedPulling="2026-04-17 20:06:22.726331294 +0000 UTC m=+121.594151044" lastFinishedPulling="2026-04-17 20:06:27.553959395 +0000 UTC m=+126.421779149" observedRunningTime="2026-04-17 20:06:28.120906145 +0000 UTC m=+126.988725906" watchObservedRunningTime="2026-04-17 20:06:28.121800351 +0000 UTC m=+126.989620125" Apr 17 20:06:28.825495 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:28.825290 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-64db8476cb-95w6j" podUID="8eafe28a-1549-4404-a39c-dad771c51b73" containerName="registry" containerID="cri-o://aceccba09fec67b6a07cee09e662f287b4bfe2bdcbff21e72345c3a085f386ab" gracePeriod=30 Apr 17 20:06:29.098618 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.098579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerStarted","Data":"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2"} Apr 17 20:06:29.104578 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.104514 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerStarted","Data":"5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c"} Apr 17 20:06:29.104578 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.104553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerStarted","Data":"832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f"} Apr 17 20:06:29.104578 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.104571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerStarted","Data":"ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a"} Apr 17 20:06:29.107988 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.107145 2576 generic.go:358] "Generic (PLEG): container finished" podID="8eafe28a-1549-4404-a39c-dad771c51b73" containerID="aceccba09fec67b6a07cee09e662f287b4bfe2bdcbff21e72345c3a085f386ab" exitCode=0 Apr 17 20:06:29.107988 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.107469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64db8476cb-95w6j" event={"ID":"8eafe28a-1549-4404-a39c-dad771c51b73","Type":"ContainerDied","Data":"aceccba09fec67b6a07cee09e662f287b4bfe2bdcbff21e72345c3a085f386ab"} Apr 17 20:06:29.107988 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.107507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64db8476cb-95w6j" event={"ID":"8eafe28a-1549-4404-a39c-dad771c51b73","Type":"ContainerDied","Data":"d75ec74389c7c95e502fb8ee49d71327177904b422a25fc2c40ac15b1b2b623a"} Apr 17 20:06:29.107988 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.107528 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d75ec74389c7c95e502fb8ee49d71327177904b422a25fc2c40ac15b1b2b623a" Apr 17 20:06:29.111606 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.111586 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:06:29.245777 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.245735 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8eafe28a-1549-4404-a39c-dad771c51b73-trusted-ca\") pod \"8eafe28a-1549-4404-a39c-dad771c51b73\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " Apr 17 20:06:29.246039 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.245826 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8eafe28a-1549-4404-a39c-dad771c51b73-installation-pull-secrets\") pod \"8eafe28a-1549-4404-a39c-dad771c51b73\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " Apr 17 20:06:29.246039 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.245878 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls\") pod \"8eafe28a-1549-4404-a39c-dad771c51b73\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " Apr 17 20:06:29.246039 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.245908 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8eafe28a-1549-4404-a39c-dad771c51b73-image-registry-private-configuration\") pod \"8eafe28a-1549-4404-a39c-dad771c51b73\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " Apr 17 20:06:29.246039 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.245946 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8eafe28a-1549-4404-a39c-dad771c51b73-registry-certificates\") pod \"8eafe28a-1549-4404-a39c-dad771c51b73\" (UID: 
\"8eafe28a-1549-4404-a39c-dad771c51b73\") " Apr 17 20:06:29.246039 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.245976 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-bound-sa-token\") pod \"8eafe28a-1549-4404-a39c-dad771c51b73\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " Apr 17 20:06:29.246039 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.245995 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8eafe28a-1549-4404-a39c-dad771c51b73-ca-trust-extracted\") pod \"8eafe28a-1549-4404-a39c-dad771c51b73\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " Apr 17 20:06:29.246039 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.246029 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flmts\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-kube-api-access-flmts\") pod \"8eafe28a-1549-4404-a39c-dad771c51b73\" (UID: \"8eafe28a-1549-4404-a39c-dad771c51b73\") " Apr 17 20:06:29.246523 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.246485 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eafe28a-1549-4404-a39c-dad771c51b73-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8eafe28a-1549-4404-a39c-dad771c51b73" (UID: "8eafe28a-1549-4404-a39c-dad771c51b73"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:06:29.248435 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.248405 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eafe28a-1549-4404-a39c-dad771c51b73-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8eafe28a-1549-4404-a39c-dad771c51b73" (UID: "8eafe28a-1549-4404-a39c-dad771c51b73"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:06:29.251151 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.251125 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-kube-api-access-flmts" (OuterVolumeSpecName: "kube-api-access-flmts") pod "8eafe28a-1549-4404-a39c-dad771c51b73" (UID: "8eafe28a-1549-4404-a39c-dad771c51b73"). InnerVolumeSpecName "kube-api-access-flmts". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:06:29.251151 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.251140 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eafe28a-1549-4404-a39c-dad771c51b73-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "8eafe28a-1549-4404-a39c-dad771c51b73" (UID: "8eafe28a-1549-4404-a39c-dad771c51b73"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:06:29.251322 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.251203 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8eafe28a-1549-4404-a39c-dad771c51b73" (UID: "8eafe28a-1549-4404-a39c-dad771c51b73"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:06:29.251322 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.251276 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8eafe28a-1549-4404-a39c-dad771c51b73" (UID: "8eafe28a-1549-4404-a39c-dad771c51b73"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:06:29.251675 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.251640 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eafe28a-1549-4404-a39c-dad771c51b73-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8eafe28a-1549-4404-a39c-dad771c51b73" (UID: "8eafe28a-1549-4404-a39c-dad771c51b73"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:06:29.259147 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.259124 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eafe28a-1549-4404-a39c-dad771c51b73-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8eafe28a-1549-4404-a39c-dad771c51b73" (UID: "8eafe28a-1549-4404-a39c-dad771c51b73"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:06:29.347910 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.347698 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-registry-tls\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:06:29.347910 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.347738 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8eafe28a-1549-4404-a39c-dad771c51b73-image-registry-private-configuration\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:06:29.347910 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.347755 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8eafe28a-1549-4404-a39c-dad771c51b73-registry-certificates\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:06:29.347910 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.347770 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-bound-sa-token\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:06:29.347910 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.347787 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8eafe28a-1549-4404-a39c-dad771c51b73-ca-trust-extracted\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:06:29.347910 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.347801 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-flmts\" (UniqueName: \"kubernetes.io/projected/8eafe28a-1549-4404-a39c-dad771c51b73-kube-api-access-flmts\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath 
\"\"" Apr 17 20:06:29.347910 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.347815 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8eafe28a-1549-4404-a39c-dad771c51b73-trusted-ca\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:06:29.347910 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:29.347829 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8eafe28a-1549-4404-a39c-dad771c51b73-installation-pull-secrets\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:06:30.111868 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:30.111515 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64db8476cb-95w6j" Apr 17 20:06:30.128933 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:30.128894 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64db8476cb-95w6j"] Apr 17 20:06:30.131740 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:30.131707 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-64db8476cb-95w6j"] Apr 17 20:06:31.118706 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:31.118610 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerStarted","Data":"a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce"} Apr 17 20:06:31.152041 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:31.151963 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.639231526 podStartE2EDuration="18.151946686s" podCreationTimestamp="2026-04-17 20:06:13 +0000 UTC" firstStartedPulling="2026-04-17 20:06:14.12377516 +0000 UTC 
m=+112.991594903" lastFinishedPulling="2026-04-17 20:06:30.636490313 +0000 UTC m=+129.504310063" observedRunningTime="2026-04-17 20:06:31.146431562 +0000 UTC m=+130.014251325" watchObservedRunningTime="2026-04-17 20:06:31.151946686 +0000 UTC m=+130.019766448" Apr 17 20:06:31.369114 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:31.369087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:06:31.371761 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:31.371732 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e87cc22-ed21-46b2-9dac-b1a3906efc5c-metrics-certs\") pod \"network-metrics-daemon-cnnh2\" (UID: \"7e87cc22-ed21-46b2-9dac-b1a3906efc5c\") " pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:06:31.547659 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:31.547580 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x8jh2\"" Apr 17 20:06:31.555584 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:31.555557 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cnnh2" Apr 17 20:06:31.643311 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:31.642914 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eafe28a-1549-4404-a39c-dad771c51b73" path="/var/lib/kubelet/pods/8eafe28a-1549-4404-a39c-dad771c51b73/volumes" Apr 17 20:06:31.927107 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:31.926808 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cnnh2"] Apr 17 20:06:31.929816 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:06:31.929782 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e87cc22_ed21_46b2_9dac_b1a3906efc5c.slice/crio-35aa0abdd8c3e75443c6d318600b59933e5903be7ae4ba0edb9288ada22a7e60 WatchSource:0}: Error finding container 35aa0abdd8c3e75443c6d318600b59933e5903be7ae4ba0edb9288ada22a7e60: Status 404 returned error can't find the container with id 35aa0abdd8c3e75443c6d318600b59933e5903be7ae4ba0edb9288ada22a7e60 Apr 17 20:06:32.123629 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:32.123573 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cnnh2" event={"ID":"7e87cc22-ed21-46b2-9dac-b1a3906efc5c","Type":"ContainerStarted","Data":"35aa0abdd8c3e75443c6d318600b59933e5903be7ae4ba0edb9288ada22a7e60"} Apr 17 20:06:32.127199 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:32.127164 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerStarted","Data":"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c"} Apr 17 20:06:32.127365 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:32.127210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerStarted","Data":"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617"} Apr 17 20:06:32.127365 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:32.127227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerStarted","Data":"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4"} Apr 17 20:06:32.127365 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:32.127241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerStarted","Data":"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2"} Apr 17 20:06:32.151674 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:32.151620 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=6.232156622 podStartE2EDuration="14.15160402s" podCreationTimestamp="2026-04-17 20:06:18 +0000 UTC" firstStartedPulling="2026-04-17 20:06:23.054057078 +0000 UTC m=+121.921876818" lastFinishedPulling="2026-04-17 20:06:30.973504473 +0000 UTC m=+129.841324216" observedRunningTime="2026-04-17 20:06:32.15000014 +0000 UTC m=+131.017819915" watchObservedRunningTime="2026-04-17 20:06:32.15160402 +0000 UTC m=+131.019423782" Apr 17 20:06:33.506972 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:33.506937 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:06:34.137123 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:34.137084 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cnnh2" 
event={"ID":"7e87cc22-ed21-46b2-9dac-b1a3906efc5c","Type":"ContainerStarted","Data":"72b9ad53d24ed21bd7111a628dd8a031e4f5c55beb833cde675adf89615b3d4c"} Apr 17 20:06:34.137123 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:34.137122 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cnnh2" event={"ID":"7e87cc22-ed21-46b2-9dac-b1a3906efc5c","Type":"ContainerStarted","Data":"425fce1c380107d51bc01308558d476636039d5a51f7da1b703849980eb75508"} Apr 17 20:06:34.161832 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:34.161784 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cnnh2" podStartSLOduration=131.507521584 podStartE2EDuration="2m13.161768576s" podCreationTimestamp="2026-04-17 20:04:21 +0000 UTC" firstStartedPulling="2026-04-17 20:06:31.932190709 +0000 UTC m=+130.800010449" lastFinishedPulling="2026-04-17 20:06:33.586437692 +0000 UTC m=+132.454257441" observedRunningTime="2026-04-17 20:06:34.159746671 +0000 UTC m=+133.027566432" watchObservedRunningTime="2026-04-17 20:06:34.161768576 +0000 UTC m=+133.029588336" Apr 17 20:06:36.808817 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:36.808783 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw" Apr 17 20:06:36.808817 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:36.808822 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw" Apr 17 20:06:52.646496 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:52.646464 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5f85496cb-r87zd_69b938b6-9d16-44cc-9e04-55819bafd06e/router/0.log" Apr 17 20:06:52.651972 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:52.651948 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-l2wrg_642f7cfc-05bf-4010-8948-bc21230b60e0/serve-healthcheck-canary/0.log" Apr 17 20:06:56.814491 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:56.814456 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw" Apr 17 20:06:56.818401 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:06:56.818376 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6f8f6f89d6-p2bnw" Apr 17 20:07:11.252348 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:11.252307 2576 generic.go:358] "Generic (PLEG): container finished" podID="fb0f783e-6eb1-4b42-82f2-7eed827adacb" containerID="2dd9e741be1f9f56a9fdc9b1aa6367df5f7df3497db96c7b266f9fa194fe148b" exitCode=0 Apr 17 20:07:11.252348 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:11.252354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" event={"ID":"fb0f783e-6eb1-4b42-82f2-7eed827adacb","Type":"ContainerDied","Data":"2dd9e741be1f9f56a9fdc9b1aa6367df5f7df3497db96c7b266f9fa194fe148b"} Apr 17 20:07:11.252794 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:11.252661 2576 scope.go:117] "RemoveContainer" containerID="2dd9e741be1f9f56a9fdc9b1aa6367df5f7df3497db96c7b266f9fa194fe148b" Apr 17 20:07:12.257603 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:12.257561 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrtjl" event={"ID":"fb0f783e-6eb1-4b42-82f2-7eed827adacb","Type":"ContainerStarted","Data":"44aeabd167d28ada6cd8268528cc6d4f05d85f2c09b483fd9706835959c63370"} Apr 17 20:07:18.507535 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:18.507502 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:18.523123 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:18.523092 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:19.292671 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:19.292638 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:32.304277 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:32.304222 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:07:32.306163 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:32.306094 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="alertmanager" containerID="cri-o://4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52" gracePeriod=120 Apr 17 20:07:32.306902 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:32.306532 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="prom-label-proxy" containerID="cri-o://a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce" gracePeriod=120 Apr 17 20:07:32.306902 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:32.306588 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="kube-rbac-proxy-web" containerID="cri-o://ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a" gracePeriod=120 Apr 17 20:07:32.306902 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:32.306650 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="config-reloader" containerID="cri-o://88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c" gracePeriod=120 Apr 17 20:07:32.306902 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:32.306732 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="kube-rbac-proxy-metric" containerID="cri-o://5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c" gracePeriod=120 Apr 17 20:07:32.306902 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:32.306812 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="kube-rbac-proxy" containerID="cri-o://832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f" gracePeriod=120 Apr 17 20:07:33.322285 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.322253 2576 generic.go:358] "Generic (PLEG): container finished" podID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerID="a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce" exitCode=0 Apr 17 20:07:33.322285 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.322281 2576 generic.go:358] "Generic (PLEG): container finished" podID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerID="832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f" exitCode=0 Apr 17 20:07:33.322285 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.322290 2576 generic.go:358] "Generic (PLEG): container finished" podID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerID="88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c" exitCode=0 Apr 17 20:07:33.322697 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.322296 2576 generic.go:358] "Generic (PLEG): container finished" podID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" 
containerID="4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52" exitCode=0 Apr 17 20:07:33.322697 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.322322 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerDied","Data":"a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce"} Apr 17 20:07:33.322697 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.322358 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerDied","Data":"832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f"} Apr 17 20:07:33.322697 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.322368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerDied","Data":"88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c"} Apr 17 20:07:33.322697 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.322377 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerDied","Data":"4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52"} Apr 17 20:07:33.552966 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.552943 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:07:33.722299 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722210 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-main-tls\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.722299 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722246 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-tls-assets\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.722299 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722275 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.722661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722389 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-cluster-tls-config\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.722661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722432 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy-web\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" 
(UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.722661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722469 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-metrics-client-ca\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.722661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722500 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b227\" (UniqueName: \"kubernetes.io/projected/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-kube-api-access-6b227\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.722661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722522 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-config-volume\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.722661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722547 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-trusted-ca-bundle\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.722661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722578 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-config-out\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.722661 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:07:33.722604 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.722661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722641 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-main-db\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.723124 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722690 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-web-config\") pod \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\" (UID: \"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d\") " Apr 17 20:07:33.723124 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.722934 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:07:33.723124 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.723059 2576 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-metrics-client-ca\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:33.723725 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.723665 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:07:33.723921 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.723893 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:07:33.725503 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.725477 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:33.726463 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.726438 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-config-out" (OuterVolumeSpecName: "config-out") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:07:33.726609 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.726581 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:07:33.726713 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.726627 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-kube-api-access-6b227" (OuterVolumeSpecName: "kube-api-access-6b227") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "kube-api-access-6b227". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:07:33.726713 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.726649 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:33.726830 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.726742 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:33.727057 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.727035 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-config-volume" (OuterVolumeSpecName: "config-volume") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:33.727119 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.727102 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:33.731243 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.731133 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:33.737591 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.737571 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-web-config" (OuterVolumeSpecName: "web-config") pod "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" (UID: "0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:33.824080 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.824033 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-config-out\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:33.824080 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.824079 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:33.824080 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.824093 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-main-db\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:33.824305 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.824103 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-web-config\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:33.824305 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.824112 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-main-tls\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:33.824305 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.824121 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-tls-assets\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:33.824305 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.824131 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:33.824305 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.824140 2576 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-cluster-tls-config\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:33.824305 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.824149 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:33.824305 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.824158 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6b227\" (UniqueName: \"kubernetes.io/projected/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-kube-api-access-6b227\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:33.824305 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.824166 2576 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-config-volume\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:33.824305 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:33.824174 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:34.327949 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.327914 2576 generic.go:358] "Generic (PLEG): container finished" podID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerID="5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c" exitCode=0 Apr 17 20:07:34.327949 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.327943 2576 generic.go:358] "Generic (PLEG): container finished" podID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerID="ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a" exitCode=0 Apr 17 20:07:34.328429 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.327989 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerDied","Data":"5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c"} Apr 17 20:07:34.328429 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.328021 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:07:34.328429 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.328032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerDied","Data":"ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a"} Apr 17 20:07:34.328429 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.328045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d","Type":"ContainerDied","Data":"98ce3edd0f907073919cea4809bce37c9f140ccad73aa54edf7490e08e051044"} Apr 17 20:07:34.328429 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.328060 2576 scope.go:117] "RemoveContainer" containerID="a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce" Apr 17 20:07:34.335646 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.335631 2576 scope.go:117] "RemoveContainer" containerID="5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c" Apr 17 20:07:34.342211 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.342194 2576 scope.go:117] "RemoveContainer" containerID="832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f" Apr 17 20:07:34.349040 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.348939 2576 scope.go:117] "RemoveContainer" containerID="ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a" Apr 17 20:07:34.351321 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.351301 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:07:34.355636 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.355604 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:07:34.356810 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.356796 2576 
scope.go:117] "RemoveContainer" containerID="88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c" Apr 17 20:07:34.363437 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.363418 2576 scope.go:117] "RemoveContainer" containerID="4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52" Apr 17 20:07:34.369753 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.369733 2576 scope.go:117] "RemoveContainer" containerID="dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe" Apr 17 20:07:34.376096 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.376074 2576 scope.go:117] "RemoveContainer" containerID="a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce" Apr 17 20:07:34.376367 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:34.376344 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce\": container with ID starting with a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce not found: ID does not exist" containerID="a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce" Apr 17 20:07:34.376415 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.376378 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce"} err="failed to get container status \"a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce\": rpc error: code = NotFound desc = could not find container \"a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce\": container with ID starting with a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce not found: ID does not exist" Apr 17 20:07:34.376415 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.376410 2576 scope.go:117] "RemoveContainer" containerID="5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c" 
Apr 17 20:07:34.376648 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:34.376629 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c\": container with ID starting with 5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c not found: ID does not exist" containerID="5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c" Apr 17 20:07:34.376740 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.376655 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c"} err="failed to get container status \"5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c\": rpc error: code = NotFound desc = could not find container \"5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c\": container with ID starting with 5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c not found: ID does not exist" Apr 17 20:07:34.376740 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.376673 2576 scope.go:117] "RemoveContainer" containerID="832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f" Apr 17 20:07:34.376997 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:34.376975 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f\": container with ID starting with 832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f not found: ID does not exist" containerID="832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f" Apr 17 20:07:34.377092 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.377002 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f"} err="failed to get container status \"832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f\": rpc error: code = NotFound desc = could not find container \"832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f\": container with ID starting with 832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f not found: ID does not exist" Apr 17 20:07:34.377092 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.377017 2576 scope.go:117] "RemoveContainer" containerID="ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a" Apr 17 20:07:34.377399 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:34.377375 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a\": container with ID starting with ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a not found: ID does not exist" containerID="ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a" Apr 17 20:07:34.377477 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.377405 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a"} err="failed to get container status \"ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a\": rpc error: code = NotFound desc = could not find container \"ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a\": container with ID starting with ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a not found: ID does not exist" Apr 17 20:07:34.377477 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.377426 2576 scope.go:117] "RemoveContainer" containerID="88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c" Apr 17 20:07:34.377670 ip-10-0-131-172 
kubenswrapper[2576]: E0417 20:07:34.377650 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c\": container with ID starting with 88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c not found: ID does not exist" containerID="88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c" Apr 17 20:07:34.377709 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.377675 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c"} err="failed to get container status \"88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c\": rpc error: code = NotFound desc = could not find container \"88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c\": container with ID starting with 88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c not found: ID does not exist" Apr 17 20:07:34.377709 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.377690 2576 scope.go:117] "RemoveContainer" containerID="4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52" Apr 17 20:07:34.378161 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:34.378141 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52\": container with ID starting with 4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52 not found: ID does not exist" containerID="4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52" Apr 17 20:07:34.378224 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.378164 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52"} 
err="failed to get container status \"4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52\": rpc error: code = NotFound desc = could not find container \"4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52\": container with ID starting with 4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52 not found: ID does not exist" Apr 17 20:07:34.378224 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.378178 2576 scope.go:117] "RemoveContainer" containerID="dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe" Apr 17 20:07:34.378382 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:34.378367 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe\": container with ID starting with dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe not found: ID does not exist" containerID="dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe" Apr 17 20:07:34.378419 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.378384 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe"} err="failed to get container status \"dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe\": rpc error: code = NotFound desc = could not find container \"dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe\": container with ID starting with dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe not found: ID does not exist" Apr 17 20:07:34.378419 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.378399 2576 scope.go:117] "RemoveContainer" containerID="a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce" Apr 17 20:07:34.378643 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.378624 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce"} err="failed to get container status \"a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce\": rpc error: code = NotFound desc = could not find container \"a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce\": container with ID starting with a6ea9197e526b36e37b830d467849a7844dad40daf3d040dd574377df3f174ce not found: ID does not exist" Apr 17 20:07:34.378718 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.378646 2576 scope.go:117] "RemoveContainer" containerID="5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c" Apr 17 20:07:34.378886 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.378868 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c"} err="failed to get container status \"5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c\": rpc error: code = NotFound desc = could not find container \"5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c\": container with ID starting with 5b9448abd3f038b16df183c4b1f1e29e6a30fc1b2912407bdd71bbe9a194390c not found: ID does not exist" Apr 17 20:07:34.378937 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.378887 2576 scope.go:117] "RemoveContainer" containerID="832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f" Apr 17 20:07:34.379127 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.379103 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f"} err="failed to get container status \"832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f\": rpc error: code = NotFound desc = could not find container \"832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f\": container with ID starting with 
832b364ef9ab7670400c768540e8ee89eef29f3983e5ea526c9b454b7941595f not found: ID does not exist" Apr 17 20:07:34.379127 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.379126 2576 scope.go:117] "RemoveContainer" containerID="ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a" Apr 17 20:07:34.379331 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.379316 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a"} err="failed to get container status \"ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a\": rpc error: code = NotFound desc = could not find container \"ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a\": container with ID starting with ce0cd3a1cb483fe415b0ae91addb6899927fca6cedcf517e0ac6b5e4f46e8b9a not found: ID does not exist" Apr 17 20:07:34.379377 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.379331 2576 scope.go:117] "RemoveContainer" containerID="88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c" Apr 17 20:07:34.379525 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.379509 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c"} err="failed to get container status \"88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c\": rpc error: code = NotFound desc = could not find container \"88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c\": container with ID starting with 88f380ed1679fb604fffdab23d43c2bafd48d0d5bea25c0a8c55b73b3fe8984c not found: ID does not exist" Apr 17 20:07:34.379578 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.379525 2576 scope.go:117] "RemoveContainer" containerID="4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52" Apr 17 20:07:34.379731 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.379715 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52"} err="failed to get container status \"4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52\": rpc error: code = NotFound desc = could not find container \"4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52\": container with ID starting with 4e5a1bc45ae7f8fd865c8f36d3b9799b860873e74862162264b641b978dc5e52 not found: ID does not exist" Apr 17 20:07:34.379731 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.379730 2576 scope.go:117] "RemoveContainer" containerID="dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe" Apr 17 20:07:34.380063 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:34.380036 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe"} err="failed to get container status \"dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe\": rpc error: code = NotFound desc = could not find container \"dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe\": container with ID starting with dc4ffe782285e0b09113efcac8b83e487839f3ca57b9b680384834d7535704fe not found: ID does not exist" Apr 17 20:07:35.637621 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:35.637589 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" path="/var/lib/kubelet/pods/0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d/volumes" Apr 17 20:07:36.527701 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.527665 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:07:36.528302 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.528158 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="prometheus" containerID="cri-o://41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3" gracePeriod=600 Apr 17 20:07:36.528302 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.528186 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="thanos-sidecar" containerID="cri-o://f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2" gracePeriod=600 Apr 17 20:07:36.528302 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.528194 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="kube-rbac-proxy-web" containerID="cri-o://4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4" gracePeriod=600 Apr 17 20:07:36.528302 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.528237 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="config-reloader" containerID="cri-o://3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2" gracePeriod=600 Apr 17 20:07:36.528302 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.528186 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="kube-rbac-proxy" containerID="cri-o://0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617" gracePeriod=600 Apr 17 20:07:36.528638 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.528245 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="kube-rbac-proxy-thanos" 
containerID="cri-o://c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c" gracePeriod=600 Apr 17 20:07:36.768434 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.768411 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:36.950485 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.950444 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5d48dcb-f9b5-4299-89f4-4829a18daddd-tls-assets\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.950677 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.950496 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-web-config\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.950677 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.950538 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz8p9\" (UniqueName: \"kubernetes.io/projected/e5d48dcb-f9b5-4299-89f4-4829a18daddd-kube-api-access-cz8p9\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.950793 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.950710 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-kube-rbac-proxy\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.950793 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.950783 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" 
(UniqueName: \"kubernetes.io/empty-dir/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-k8s-db\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.950941 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.950814 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-config\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.950941 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.950866 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5d48dcb-f9b5-4299-89f4-4829a18daddd-config-out\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.950941 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.950923 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.951073 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.950961 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-tls\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.951073 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.950989 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.951073 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.951014 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-k8s-rulefiles-0\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.951073 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.951061 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-kubelet-serving-ca-bundle\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.951274 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.951088 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-serving-certs-ca-bundle\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.951274 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.951114 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-thanos-prometheus-http-client-file\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.951274 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.951151 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-trusted-ca-bundle\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.951274 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.951181 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-grpc-tls\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.951274 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.951220 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-metrics-client-certs\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.951274 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.951250 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-metrics-client-ca\") pod \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\" (UID: \"e5d48dcb-f9b5-4299-89f4-4829a18daddd\") " Apr 17 20:07:36.951896 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.951653 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:07:36.951896 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.951825 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:07:36.952627 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.952118 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:07:36.952627 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.952344 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:07:36.953055 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.953002 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:07:36.953055 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.953029 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-k8s-db\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:36.953055 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.953055 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:36.953254 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.953071 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:36.953254 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.953087 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:36.953726 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.953491 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:07:36.953968 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.953940 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d48dcb-f9b5-4299-89f4-4829a18daddd-kube-api-access-cz8p9" (OuterVolumeSpecName: "kube-api-access-cz8p9") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "kube-api-access-cz8p9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:07:36.954508 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.954462 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:36.954636 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.954609 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:36.954714 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.954687 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:36.954773 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.954729 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d48dcb-f9b5-4299-89f4-4829a18daddd-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:07:36.955008 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.954973 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5d48dcb-f9b5-4299-89f4-4829a18daddd-config-out" (OuterVolumeSpecName: "config-out") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:07:36.955724 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.955699 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:36.955929 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.955907 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:36.956020 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.955963 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-config" (OuterVolumeSpecName: "config") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:36.956336 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.956316 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:36.956434 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.956418 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:36.963484 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:36.963462 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-web-config" (OuterVolumeSpecName: "web-config") pod "e5d48dcb-f9b5-4299-89f4-4829a18daddd" (UID: "e5d48dcb-f9b5-4299-89f4-4829a18daddd"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:07:37.053982 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.053945 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.053982 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.053977 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.053982 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.053988 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.054197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.053997 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.054197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.054008 2576 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.054197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.054019 2576 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-grpc-tls\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.054197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.054027 2576 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-metrics-client-certs\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.054197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.054036 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5d48dcb-f9b5-4299-89f4-4829a18daddd-configmap-metrics-client-ca\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.054197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.054045 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5d48dcb-f9b5-4299-89f4-4829a18daddd-tls-assets\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.054197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.054054 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-web-config\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.054197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.054062 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cz8p9\" (UniqueName: \"kubernetes.io/projected/e5d48dcb-f9b5-4299-89f4-4829a18daddd-kube-api-access-cz8p9\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.054197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.054070 2576 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-secret-kube-rbac-proxy\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.054197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.054081 2576 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5d48dcb-f9b5-4299-89f4-4829a18daddd-config\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.054197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.054089 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5d48dcb-f9b5-4299-89f4-4829a18daddd-config-out\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:07:37.342219 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342179 2576 generic.go:358] "Generic (PLEG): container finished" podID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerID="c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c" exitCode=0 Apr 17 20:07:37.342219 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342213 2576 generic.go:358] "Generic (PLEG): container finished" podID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerID="0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617" exitCode=0 Apr 17 20:07:37.342219 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342226 2576 generic.go:358] "Generic (PLEG): container finished" podID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerID="4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4" exitCode=0 Apr 17 20:07:37.342219 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerDied","Data":"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c"} Apr 17 20:07:37.342590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342279 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerDied","Data":"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617"} Apr 17 20:07:37.342590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerDied","Data":"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4"} Apr 17 20:07:37.342590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerDied","Data":"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2"} Apr 17 20:07:37.342590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342235 2576 generic.go:358] "Generic (PLEG): container finished" podID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerID="f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2" exitCode=0 Apr 17 20:07:37.342590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342340 2576 scope.go:117] "RemoveContainer" containerID="c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c" Apr 17 20:07:37.342590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342350 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.342590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerDied","Data":"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2"} Apr 17 20:07:37.342590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342352 2576 generic.go:358] "Generic (PLEG): container finished" podID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerID="3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2" exitCode=0 Apr 17 20:07:37.342590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342475 2576 generic.go:358] "Generic (PLEG): container finished" podID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerID="41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3" exitCode=0 Apr 17 20:07:37.342590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerDied","Data":"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3"} Apr 17 20:07:37.343120 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.342600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5d48dcb-f9b5-4299-89f4-4829a18daddd","Type":"ContainerDied","Data":"5b97c375857821b4a7534e43b3f0779d4be668541766d7bf75dbc1fb88916734"} Apr 17 20:07:37.349691 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.349657 2576 scope.go:117] "RemoveContainer" containerID="0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617" Apr 17 20:07:37.356411 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.356395 2576 scope.go:117] "RemoveContainer" 
containerID="4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4" Apr 17 20:07:37.362654 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.362631 2576 scope.go:117] "RemoveContainer" containerID="f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2" Apr 17 20:07:37.365834 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.365811 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:07:37.369720 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.369702 2576 scope.go:117] "RemoveContainer" containerID="3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2" Apr 17 20:07:37.369871 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.369784 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:07:37.376243 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.376211 2576 scope.go:117] "RemoveContainer" containerID="41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3" Apr 17 20:07:37.382978 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.382961 2576 scope.go:117] "RemoveContainer" containerID="f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16" Apr 17 20:07:37.389089 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.389069 2576 scope.go:117] "RemoveContainer" containerID="c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c" Apr 17 20:07:37.389322 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:37.389305 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": container with ID starting with c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c not found: ID does not exist" containerID="c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c" Apr 17 20:07:37.389390 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.389335 
2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c"} err="failed to get container status \"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": rpc error: code = NotFound desc = could not find container \"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": container with ID starting with c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c not found: ID does not exist" Apr 17 20:07:37.389390 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.389362 2576 scope.go:117] "RemoveContainer" containerID="0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617" Apr 17 20:07:37.389604 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:37.389586 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": container with ID starting with 0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617 not found: ID does not exist" containerID="0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617" Apr 17 20:07:37.389646 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.389610 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617"} err="failed to get container status \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": rpc error: code = NotFound desc = could not find container \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": container with ID starting with 0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617 not found: ID does not exist" Apr 17 20:07:37.389646 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.389627 2576 scope.go:117] "RemoveContainer" 
containerID="4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4" Apr 17 20:07:37.389855 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:37.389827 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": container with ID starting with 4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4 not found: ID does not exist" containerID="4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4" Apr 17 20:07:37.389909 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.389862 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4"} err="failed to get container status \"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": rpc error: code = NotFound desc = could not find container \"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": container with ID starting with 4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4 not found: ID does not exist" Apr 17 20:07:37.389909 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.389882 2576 scope.go:117] "RemoveContainer" containerID="f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2" Apr 17 20:07:37.390132 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:37.390115 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": container with ID starting with f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2 not found: ID does not exist" containerID="f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2" Apr 17 20:07:37.390204 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.390139 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2"} err="failed to get container status \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": rpc error: code = NotFound desc = could not find container \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": container with ID starting with f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2 not found: ID does not exist" Apr 17 20:07:37.390204 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.390160 2576 scope.go:117] "RemoveContainer" containerID="3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2" Apr 17 20:07:37.390381 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:37.390365 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": container with ID starting with 3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2 not found: ID does not exist" containerID="3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2" Apr 17 20:07:37.390418 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.390385 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2"} err="failed to get container status \"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": rpc error: code = NotFound desc = could not find container \"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": container with ID starting with 3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2 not found: ID does not exist" Apr 17 20:07:37.390418 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.390400 2576 scope.go:117] "RemoveContainer" containerID="41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3" Apr 17 20:07:37.390615 
ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:37.390596 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": container with ID starting with 41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3 not found: ID does not exist" containerID="41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3" Apr 17 20:07:37.390652 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.390620 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3"} err="failed to get container status \"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": rpc error: code = NotFound desc = could not find container \"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": container with ID starting with 41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3 not found: ID does not exist" Apr 17 20:07:37.390652 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.390637 2576 scope.go:117] "RemoveContainer" containerID="f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16" Apr 17 20:07:37.390856 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:07:37.390827 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": container with ID starting with f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16 not found: ID does not exist" containerID="f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16" Apr 17 20:07:37.390913 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.390862 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16"} err="failed to get container status \"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": rpc error: code = NotFound desc = could not find container \"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": container with ID starting with f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16 not found: ID does not exist" Apr 17 20:07:37.390913 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.390884 2576 scope.go:117] "RemoveContainer" containerID="c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c" Apr 17 20:07:37.391112 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.391094 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c"} err="failed to get container status \"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": rpc error: code = NotFound desc = could not find container \"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": container with ID starting with c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c not found: ID does not exist" Apr 17 20:07:37.391158 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.391113 2576 scope.go:117] "RemoveContainer" containerID="0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617" Apr 17 20:07:37.391321 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.391305 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617"} err="failed to get container status \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": rpc error: code = NotFound desc = could not find container \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": container with ID starting with 
0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617 not found: ID does not exist" Apr 17 20:07:37.391371 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.391322 2576 scope.go:117] "RemoveContainer" containerID="4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4" Apr 17 20:07:37.391529 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.391512 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4"} err="failed to get container status \"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": rpc error: code = NotFound desc = could not find container \"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": container with ID starting with 4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4 not found: ID does not exist" Apr 17 20:07:37.391565 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.391529 2576 scope.go:117] "RemoveContainer" containerID="f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2" Apr 17 20:07:37.391722 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.391695 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2"} err="failed to get container status \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": rpc error: code = NotFound desc = could not find container \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": container with ID starting with f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2 not found: ID does not exist" Apr 17 20:07:37.391760 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.391722 2576 scope.go:117] "RemoveContainer" containerID="3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2" Apr 17 20:07:37.391933 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.391918 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2"} err="failed to get container status \"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": rpc error: code = NotFound desc = could not find container \"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": container with ID starting with 3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2 not found: ID does not exist" Apr 17 20:07:37.391933 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.391932 2576 scope.go:117] "RemoveContainer" containerID="41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3" Apr 17 20:07:37.392109 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.392095 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3"} err="failed to get container status \"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": rpc error: code = NotFound desc = could not find container \"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": container with ID starting with 41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3 not found: ID does not exist" Apr 17 20:07:37.392146 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.392109 2576 scope.go:117] "RemoveContainer" containerID="f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16" Apr 17 20:07:37.392289 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.392271 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16"} err="failed to get container status \"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": rpc error: code = NotFound desc = could not find container 
\"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": container with ID starting with f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16 not found: ID does not exist" Apr 17 20:07:37.392358 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.392291 2576 scope.go:117] "RemoveContainer" containerID="c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c" Apr 17 20:07:37.392503 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.392485 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c"} err="failed to get container status \"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": rpc error: code = NotFound desc = could not find container \"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": container with ID starting with c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c not found: ID does not exist" Apr 17 20:07:37.392546 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.392504 2576 scope.go:117] "RemoveContainer" containerID="0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617" Apr 17 20:07:37.392715 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.392697 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617"} err="failed to get container status \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": rpc error: code = NotFound desc = could not find container \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": container with ID starting with 0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617 not found: ID does not exist" Apr 17 20:07:37.392751 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.392716 2576 scope.go:117] "RemoveContainer" 
containerID="4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4" Apr 17 20:07:37.392900 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.392885 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4"} err="failed to get container status \"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": rpc error: code = NotFound desc = could not find container \"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": container with ID starting with 4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4 not found: ID does not exist" Apr 17 20:07:37.392953 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.392900 2576 scope.go:117] "RemoveContainer" containerID="f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2" Apr 17 20:07:37.393109 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.393095 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2"} err="failed to get container status \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": rpc error: code = NotFound desc = could not find container \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": container with ID starting with f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2 not found: ID does not exist" Apr 17 20:07:37.393156 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.393109 2576 scope.go:117] "RemoveContainer" containerID="3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2" Apr 17 20:07:37.393302 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.393286 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2"} err="failed to get container status 
\"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": rpc error: code = NotFound desc = could not find container \"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": container with ID starting with 3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2 not found: ID does not exist" Apr 17 20:07:37.393302 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.393302 2576 scope.go:117] "RemoveContainer" containerID="41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3" Apr 17 20:07:37.393489 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.393471 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3"} err="failed to get container status \"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": rpc error: code = NotFound desc = could not find container \"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": container with ID starting with 41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3 not found: ID does not exist" Apr 17 20:07:37.393529 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.393490 2576 scope.go:117] "RemoveContainer" containerID="f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16" Apr 17 20:07:37.393687 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.393672 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16"} err="failed to get container status \"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": rpc error: code = NotFound desc = could not find container \"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": container with ID starting with f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16 not found: ID does not exist" Apr 17 20:07:37.393738 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:07:37.393687 2576 scope.go:117] "RemoveContainer" containerID="c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c" Apr 17 20:07:37.393886 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.393871 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c"} err="failed to get container status \"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": rpc error: code = NotFound desc = could not find container \"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": container with ID starting with c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c not found: ID does not exist" Apr 17 20:07:37.393939 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.393886 2576 scope.go:117] "RemoveContainer" containerID="0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617" Apr 17 20:07:37.394078 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.394059 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617"} err="failed to get container status \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": rpc error: code = NotFound desc = could not find container \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": container with ID starting with 0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617 not found: ID does not exist" Apr 17 20:07:37.394130 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.394078 2576 scope.go:117] "RemoveContainer" containerID="4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4" Apr 17 20:07:37.394279 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.394260 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4"} err="failed to get container status \"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": rpc error: code = NotFound desc = could not find container \"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": container with ID starting with 4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4 not found: ID does not exist" Apr 17 20:07:37.394323 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.394279 2576 scope.go:117] "RemoveContainer" containerID="f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2" Apr 17 20:07:37.394475 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.394461 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2"} err="failed to get container status \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": rpc error: code = NotFound desc = could not find container \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": container with ID starting with f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2 not found: ID does not exist" Apr 17 20:07:37.394512 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.394477 2576 scope.go:117] "RemoveContainer" containerID="3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2" Apr 17 20:07:37.394672 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.394652 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2"} err="failed to get container status \"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": rpc error: code = NotFound desc = could not find container \"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": container with ID starting with 
3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2 not found: ID does not exist" Apr 17 20:07:37.394748 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.394673 2576 scope.go:117] "RemoveContainer" containerID="41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3" Apr 17 20:07:37.394994 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.394968 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3"} err="failed to get container status \"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": rpc error: code = NotFound desc = could not find container \"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": container with ID starting with 41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3 not found: ID does not exist" Apr 17 20:07:37.395089 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.395004 2576 scope.go:117] "RemoveContainer" containerID="f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16" Apr 17 20:07:37.395921 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.395892 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16"} err="failed to get container status \"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": rpc error: code = NotFound desc = could not find container \"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": container with ID starting with f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16 not found: ID does not exist" Apr 17 20:07:37.395921 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.395921 2576 scope.go:117] "RemoveContainer" containerID="c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c" Apr 17 20:07:37.396193 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.396176 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c"} err="failed to get container status \"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": rpc error: code = NotFound desc = could not find container \"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": container with ID starting with c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c not found: ID does not exist" Apr 17 20:07:37.396246 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.396193 2576 scope.go:117] "RemoveContainer" containerID="0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617" Apr 17 20:07:37.396441 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.396421 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617"} err="failed to get container status \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": rpc error: code = NotFound desc = could not find container \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": container with ID starting with 0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617 not found: ID does not exist" Apr 17 20:07:37.396516 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.396442 2576 scope.go:117] "RemoveContainer" containerID="4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4" Apr 17 20:07:37.396609 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.396589 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:07:37.396700 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.396656 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4"} err="failed to get container status 
\"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": rpc error: code = NotFound desc = could not find container \"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": container with ID starting with 4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4 not found: ID does not exist" Apr 17 20:07:37.396700 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.396671 2576 scope.go:117] "RemoveContainer" containerID="f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2" Apr 17 20:07:37.396933 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.396908 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2"} err="failed to get container status \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": rpc error: code = NotFound desc = could not find container \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": container with ID starting with f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2 not found: ID does not exist" Apr 17 20:07:37.397003 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.396939 2576 scope.go:117] "RemoveContainer" containerID="3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2" Apr 17 20:07:37.397003 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.396921 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="kube-rbac-proxy-thanos" Apr 17 20:07:37.397003 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.396990 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="kube-rbac-proxy-thanos" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397008 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="alertmanager" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397017 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="alertmanager" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397032 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="kube-rbac-proxy" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397041 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="kube-rbac-proxy" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397051 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="prometheus" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397058 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="prometheus" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397073 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="config-reloader" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397082 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="config-reloader" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397093 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="kube-rbac-proxy" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397101 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="kube-rbac-proxy" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397115 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="init-config-reloader" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397123 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="init-config-reloader" Apr 17 20:07:37.397132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397135 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="prom-label-proxy" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397143 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="prom-label-proxy" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397155 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="init-config-reloader" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397163 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="init-config-reloader" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397172 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="config-reloader" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397182 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="config-reloader" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397194 2576 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="thanos-sidecar" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397202 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="thanos-sidecar" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397213 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8eafe28a-1549-4404-a39c-dad771c51b73" containerName="registry" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397222 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eafe28a-1549-4404-a39c-dad771c51b73" containerName="registry" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397229 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2"} err="failed to get container status \"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": rpc error: code = NotFound desc = could not find container \"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": container with ID starting with 3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2 not found: ID does not exist" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397253 2576 scope.go:117] "RemoveContainer" containerID="41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397236 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="kube-rbac-proxy-web" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397298 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="kube-rbac-proxy-web" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397316 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="kube-rbac-proxy-web" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397323 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="kube-rbac-proxy-web" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397332 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="kube-rbac-proxy-metric" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397341 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="kube-rbac-proxy-metric" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397420 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="prometheus" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397431 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="kube-rbac-proxy-web" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397442 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="config-reloader" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397453 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="thanos-sidecar" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397463 2576 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="kube-rbac-proxy" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397470 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="kube-rbac-proxy-metric" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397479 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="kube-rbac-proxy-web" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397489 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="kube-rbac-proxy" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397498 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="config-reloader" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397495 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3"} err="failed to get container status \"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": rpc error: code = NotFound desc = could not find container \"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": container with ID starting with 41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3 not found: ID does not exist" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397505 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" containerName="kube-rbac-proxy-thanos" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397509 2576 scope.go:117] "RemoveContainer" 
containerID="f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397513 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="alertmanager" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397522 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8eafe28a-1549-4404-a39c-dad771c51b73" containerName="registry" Apr 17 20:07:37.397601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397531 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b1cc99a-9c75-4a2c-bacd-46e8d69cd42d" containerName="prom-label-proxy" Apr 17 20:07:37.398809 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397771 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16"} err="failed to get container status \"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": rpc error: code = NotFound desc = could not find container \"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": container with ID starting with f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16 not found: ID does not exist" Apr 17 20:07:37.398809 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.397783 2576 scope.go:117] "RemoveContainer" containerID="c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c" Apr 17 20:07:37.398809 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.398003 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c"} err="failed to get container status \"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": rpc error: code = NotFound desc = could not find container 
\"c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c\": container with ID starting with c9760fb1e9402d8529a0d1e053d741c2550676ef1a1ffe1180f5ce13668c252c not found: ID does not exist" Apr 17 20:07:37.398809 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.398031 2576 scope.go:117] "RemoveContainer" containerID="0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617" Apr 17 20:07:37.398809 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.398265 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617"} err="failed to get container status \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": rpc error: code = NotFound desc = could not find container \"0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617\": container with ID starting with 0eb1c770bfb0a201eae4bb5ef493a7c141a2eefa83df29b5406de63c99521617 not found: ID does not exist" Apr 17 20:07:37.398809 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.398285 2576 scope.go:117] "RemoveContainer" containerID="4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4" Apr 17 20:07:37.398809 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.398501 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4"} err="failed to get container status \"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": rpc error: code = NotFound desc = could not find container \"4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4\": container with ID starting with 4f1ed52c0f790adc33a775a988d3076cf3f4ae11c89f9858f43ccb5a466a3ef4 not found: ID does not exist" Apr 17 20:07:37.398809 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.398523 2576 scope.go:117] "RemoveContainer" 
containerID="f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2" Apr 17 20:07:37.398809 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.398704 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2"} err="failed to get container status \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": rpc error: code = NotFound desc = could not find container \"f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2\": container with ID starting with f1aa297c24b90ccc087d4a3dfc011ac712ad7ffdebb0a10a5fae20a944c877d2 not found: ID does not exist" Apr 17 20:07:37.398809 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.398717 2576 scope.go:117] "RemoveContainer" containerID="3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2" Apr 17 20:07:37.399175 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.398907 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2"} err="failed to get container status \"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": rpc error: code = NotFound desc = could not find container \"3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2\": container with ID starting with 3667578875c98f6e2ba81e5229d551cc5625415c9e58508399f5158e8644fed2 not found: ID does not exist" Apr 17 20:07:37.399175 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.398923 2576 scope.go:117] "RemoveContainer" containerID="41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3" Apr 17 20:07:37.399175 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.399110 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3"} err="failed to get container status 
\"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": rpc error: code = NotFound desc = could not find container \"41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3\": container with ID starting with 41b921a40669844a2e807a176a1c000edeb942134aa814bd8f60a019b15854c3 not found: ID does not exist" Apr 17 20:07:37.399175 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.399129 2576 scope.go:117] "RemoveContainer" containerID="f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16" Apr 17 20:07:37.399371 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.399351 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16"} err="failed to get container status \"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": rpc error: code = NotFound desc = could not find container \"f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16\": container with ID starting with f1b7c6b4daa7dbfc0d51e59834db22d799466673448e0e808cb451df35f40d16 not found: ID does not exist" Apr 17 20:07:37.401998 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.400701 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.403720 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.403701 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 20:07:37.403884 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.403718 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 20:07:37.403884 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.403703 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jmxvp\"" Apr 17 20:07:37.403884 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.403743 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 20:07:37.403884 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.403753 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 20:07:37.404180 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.404164 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 20:07:37.404291 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.404199 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3bob03gk936tc\"" Apr 17 20:07:37.404594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.404570 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 20:07:37.404701 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.404609 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 20:07:37.404769 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.404716 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 20:07:37.404883 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.404827 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 20:07:37.405056 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.405019 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 20:07:37.407035 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.407012 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 20:07:37.410543 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.410525 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 20:07:37.413249 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.413216 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:07:37.558547 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.558518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd866433-16f1-482d-88aa-92bdff80ce12-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.558682 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.558552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.558682 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.558579 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.558682 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.558632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.558682 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.558668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.558926 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.558689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.558926 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.558705 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-web-config\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.558926 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.558797 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.558926 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.558855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.558926 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.558891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd866433-16f1-482d-88aa-92bdff80ce12-config-out\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.558926 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.558924 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.559213 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.558958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-config\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.559213 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.559005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.559213 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.559031 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njpcb\" (UniqueName: \"kubernetes.io/projected/fd866433-16f1-482d-88aa-92bdff80ce12-kube-api-access-njpcb\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.559213 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.559065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fd866433-16f1-482d-88aa-92bdff80ce12-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.559213 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.559096 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.559213 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.559143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.559213 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.559181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.638276 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.638198 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5d48dcb-f9b5-4299-89f4-4829a18daddd" path="/var/lib/kubelet/pods/e5d48dcb-f9b5-4299-89f4-4829a18daddd/volumes" Apr 17 20:07:37.660281 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.660248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.660409 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.660287 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-njpcb\" (UniqueName: \"kubernetes.io/projected/fd866433-16f1-482d-88aa-92bdff80ce12-kube-api-access-njpcb\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.660409 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.660316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fd866433-16f1-482d-88aa-92bdff80ce12-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.660409 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.660344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.660409 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.660381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.660676 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.660417 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.660676 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.660461 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd866433-16f1-482d-88aa-92bdff80ce12-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.660676 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.660508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.660676 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.660556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.660676 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.660590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.660676 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.660622 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.661027 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.660735 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fd866433-16f1-482d-88aa-92bdff80ce12-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.661191 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.661166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.661256 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.661211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-web-config\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.661256 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.661223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.661347 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.661254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-prometheus-k8s-tls\") pod 
\"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.661347 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.661287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.661347 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.661315 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd866433-16f1-482d-88aa-92bdff80ce12-config-out\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.661498 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.661345 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.661498 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.661384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-config\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.661669 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.661642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.662004 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.661976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.663143 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.663120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.663626 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.663598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.663703 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.663647 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.664383 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.664360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd866433-16f1-482d-88aa-92bdff80ce12-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.664778 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.664716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd866433-16f1-482d-88aa-92bdff80ce12-config-out\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.665000 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.664875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.665157 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.665037 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.665612 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.665579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.665831 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.665796 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.665976 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.665953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-config\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.666308 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.666286 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-web-config\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.666601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.666578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fd866433-16f1-482d-88aa-92bdff80ce12-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.666999 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.666979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fd866433-16f1-482d-88aa-92bdff80ce12-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.668769 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.668752 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njpcb\" (UniqueName: \"kubernetes.io/projected/fd866433-16f1-482d-88aa-92bdff80ce12-kube-api-access-njpcb\") pod \"prometheus-k8s-0\" (UID: \"fd866433-16f1-482d-88aa-92bdff80ce12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.711992 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.711953 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:07:37.838492 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:37.838457 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:07:37.845171 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:07:37.845131 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd866433_16f1_482d_88aa_92bdff80ce12.slice/crio-bbefe08a855cf1a129786920d02a6272a79cd72ba7922a4232b6a7327331f85b WatchSource:0}: Error finding container bbefe08a855cf1a129786920d02a6272a79cd72ba7922a4232b6a7327331f85b: Status 404 returned error can't find the container with id bbefe08a855cf1a129786920d02a6272a79cd72ba7922a4232b6a7327331f85b Apr 17 20:07:38.348036 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:38.347993 2576 generic.go:358] "Generic (PLEG): container finished" podID="fd866433-16f1-482d-88aa-92bdff80ce12" containerID="0697643cefdeea0fb387e73bb0dcf0525a965ca0b29712758ad36b2536fe7dbc" exitCode=0 Apr 17 20:07:38.348203 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:38.348047 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fd866433-16f1-482d-88aa-92bdff80ce12","Type":"ContainerDied","Data":"0697643cefdeea0fb387e73bb0dcf0525a965ca0b29712758ad36b2536fe7dbc"} Apr 17 20:07:38.348203 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:38.348082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fd866433-16f1-482d-88aa-92bdff80ce12","Type":"ContainerStarted","Data":"bbefe08a855cf1a129786920d02a6272a79cd72ba7922a4232b6a7327331f85b"} Apr 17 20:07:39.354128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:39.354093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fd866433-16f1-482d-88aa-92bdff80ce12","Type":"ContainerStarted","Data":"72d3a130303fba2a474f83f0d7475cf3e4f0a64947255d0da88dcbe8bcdf5df6"} Apr 17 20:07:39.354128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:39.354128 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fd866433-16f1-482d-88aa-92bdff80ce12","Type":"ContainerStarted","Data":"a225a99d7c815ef3ff2ce31b107d96114edf421954af97f506cd45b7565676e6"} Apr 17 20:07:39.354515 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:39.354138 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fd866433-16f1-482d-88aa-92bdff80ce12","Type":"ContainerStarted","Data":"dfdeb1ef4e6671d42739a16b09af9974c9fa8500f12d501d50b2d21f7c90b9d3"} Apr 17 20:07:39.354515 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:39.354146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fd866433-16f1-482d-88aa-92bdff80ce12","Type":"ContainerStarted","Data":"5caa5981a282f1f45e3bcb6f4c845d8810620ccbbc1310741e3a94d09c577cc5"} Apr 17 20:07:39.354515 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:39.354155 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fd866433-16f1-482d-88aa-92bdff80ce12","Type":"ContainerStarted","Data":"f738a8b5ece38ae95c3d5598dc25d6107a29113fedc3da5c0ae228ac4559f022"} Apr 17 20:07:39.354515 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:39.354163 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fd866433-16f1-482d-88aa-92bdff80ce12","Type":"ContainerStarted","Data":"c11399844a767902a142778a32c0149758ea44bd9b1c8657fe8a48f9cddc315b"} Apr 17 20:07:39.381029 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:39.380971 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.380941231 podStartE2EDuration="2.380941231s" podCreationTimestamp="2026-04-17 20:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:07:39.379436155 +0000 UTC m=+198.247255916" watchObservedRunningTime="2026-04-17 20:07:39.380941231 +0000 UTC m=+198.248760982" Apr 17 20:07:42.712701 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:07:42.712658 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:08:12.101253 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.101211 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vb6rl"] Apr 17 20:08:12.105029 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.105005 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vb6rl" Apr 17 20:08:12.107110 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.107086 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 20:08:12.111310 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.111288 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vb6rl"] Apr 17 20:08:12.224019 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.223985 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b1fec543-b49e-43b9-ae94-fbf7c58061e8-kubelet-config\") pod \"global-pull-secret-syncer-vb6rl\" (UID: \"b1fec543-b49e-43b9-ae94-fbf7c58061e8\") " pod="kube-system/global-pull-secret-syncer-vb6rl" Apr 17 20:08:12.224019 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.224028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b1fec543-b49e-43b9-ae94-fbf7c58061e8-dbus\") pod \"global-pull-secret-syncer-vb6rl\" (UID: \"b1fec543-b49e-43b9-ae94-fbf7c58061e8\") " pod="kube-system/global-pull-secret-syncer-vb6rl" Apr 17 20:08:12.224238 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.224051 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1fec543-b49e-43b9-ae94-fbf7c58061e8-original-pull-secret\") pod \"global-pull-secret-syncer-vb6rl\" (UID: \"b1fec543-b49e-43b9-ae94-fbf7c58061e8\") " pod="kube-system/global-pull-secret-syncer-vb6rl" Apr 17 20:08:12.325443 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.325407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/b1fec543-b49e-43b9-ae94-fbf7c58061e8-kubelet-config\") pod \"global-pull-secret-syncer-vb6rl\" (UID: \"b1fec543-b49e-43b9-ae94-fbf7c58061e8\") " pod="kube-system/global-pull-secret-syncer-vb6rl" Apr 17 20:08:12.325443 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.325448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b1fec543-b49e-43b9-ae94-fbf7c58061e8-dbus\") pod \"global-pull-secret-syncer-vb6rl\" (UID: \"b1fec543-b49e-43b9-ae94-fbf7c58061e8\") " pod="kube-system/global-pull-secret-syncer-vb6rl" Apr 17 20:08:12.325639 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.325472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1fec543-b49e-43b9-ae94-fbf7c58061e8-original-pull-secret\") pod \"global-pull-secret-syncer-vb6rl\" (UID: \"b1fec543-b49e-43b9-ae94-fbf7c58061e8\") " pod="kube-system/global-pull-secret-syncer-vb6rl" Apr 17 20:08:12.325639 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.325546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b1fec543-b49e-43b9-ae94-fbf7c58061e8-kubelet-config\") pod \"global-pull-secret-syncer-vb6rl\" (UID: \"b1fec543-b49e-43b9-ae94-fbf7c58061e8\") " pod="kube-system/global-pull-secret-syncer-vb6rl" Apr 17 20:08:12.325639 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.325608 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b1fec543-b49e-43b9-ae94-fbf7c58061e8-dbus\") pod \"global-pull-secret-syncer-vb6rl\" (UID: \"b1fec543-b49e-43b9-ae94-fbf7c58061e8\") " pod="kube-system/global-pull-secret-syncer-vb6rl" Apr 17 20:08:12.327675 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.327647 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1fec543-b49e-43b9-ae94-fbf7c58061e8-original-pull-secret\") pod \"global-pull-secret-syncer-vb6rl\" (UID: \"b1fec543-b49e-43b9-ae94-fbf7c58061e8\") " pod="kube-system/global-pull-secret-syncer-vb6rl" Apr 17 20:08:12.414470 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.414392 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vb6rl" Apr 17 20:08:12.738909 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:12.738811 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vb6rl"] Apr 17 20:08:12.742383 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:08:12.742350 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1fec543_b49e_43b9_ae94_fbf7c58061e8.slice/crio-6e3229371230eda13556debfc115b805c624e75ea30b8a91aee09acb23d905a5 WatchSource:0}: Error finding container 6e3229371230eda13556debfc115b805c624e75ea30b8a91aee09acb23d905a5: Status 404 returned error can't find the container with id 6e3229371230eda13556debfc115b805c624e75ea30b8a91aee09acb23d905a5 Apr 17 20:08:13.457407 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:13.457366 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vb6rl" event={"ID":"b1fec543-b49e-43b9-ae94-fbf7c58061e8","Type":"ContainerStarted","Data":"6e3229371230eda13556debfc115b805c624e75ea30b8a91aee09acb23d905a5"} Apr 17 20:08:17.472608 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:17.472573 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vb6rl" event={"ID":"b1fec543-b49e-43b9-ae94-fbf7c58061e8","Type":"ContainerStarted","Data":"ff51bdca13053e09af245afb76fb22a1b918d0cc2c4168d8e54f03e135f756d7"} Apr 17 20:08:17.487117 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:17.487069 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vb6rl" podStartSLOduration=1.7620765980000002 podStartE2EDuration="5.487055144s" podCreationTimestamp="2026-04-17 20:08:12 +0000 UTC" firstStartedPulling="2026-04-17 20:08:12.744006437 +0000 UTC m=+231.611826176" lastFinishedPulling="2026-04-17 20:08:16.468984981 +0000 UTC m=+235.336804722" observedRunningTime="2026-04-17 20:08:17.486007672 +0000 UTC m=+236.353827434" watchObservedRunningTime="2026-04-17 20:08:17.487055144 +0000 UTC m=+236.354874905" Apr 17 20:08:37.712311 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:37.712231 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:08:37.727353 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:37.727326 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:08:38.551170 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:08:38.551141 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:09:21.526553 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:21.526512 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/0.log" Apr 17 20:09:21.529243 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:21.529223 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/0.log" Apr 17 20:09:21.530632 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:21.530611 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 20:09:52.583518 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.583482 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-webhook-597b96b99b-n5wb2"] Apr 17 20:09:52.585315 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.585299 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2" Apr 17 20:09:52.587376 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.587355 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 20:09:52.588094 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.588078 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-qssm2\"" Apr 17 20:09:52.588155 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.588080 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 20:09:52.592817 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.592793 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-n5wb2"] Apr 17 20:09:52.620858 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.620810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de5f91a3-8486-423a-9602-778d4f161925-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-n5wb2\" (UID: \"de5f91a3-8486-423a-9602-778d4f161925\") " pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2" Apr 17 20:09:52.621001 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.620881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks67z\" (UniqueName: \"kubernetes.io/projected/de5f91a3-8486-423a-9602-778d4f161925-kube-api-access-ks67z\") pod \"cert-manager-webhook-597b96b99b-n5wb2\" (UID: \"de5f91a3-8486-423a-9602-778d4f161925\") " 
pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2"
Apr 17 20:09:52.721773 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.721737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de5f91a3-8486-423a-9602-778d4f161925-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-n5wb2\" (UID: \"de5f91a3-8486-423a-9602-778d4f161925\") " pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2"
Apr 17 20:09:52.721952 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.721803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ks67z\" (UniqueName: \"kubernetes.io/projected/de5f91a3-8486-423a-9602-778d4f161925-kube-api-access-ks67z\") pod \"cert-manager-webhook-597b96b99b-n5wb2\" (UID: \"de5f91a3-8486-423a-9602-778d4f161925\") " pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2"
Apr 17 20:09:52.729237 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.729212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks67z\" (UniqueName: \"kubernetes.io/projected/de5f91a3-8486-423a-9602-778d4f161925-kube-api-access-ks67z\") pod \"cert-manager-webhook-597b96b99b-n5wb2\" (UID: \"de5f91a3-8486-423a-9602-778d4f161925\") " pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2"
Apr 17 20:09:52.731340 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.731317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de5f91a3-8486-423a-9602-778d4f161925-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-n5wb2\" (UID: \"de5f91a3-8486-423a-9602-778d4f161925\") " pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2"
Apr 17 20:09:52.904235 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:52.904147 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2"
Apr 17 20:09:53.023624 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:53.023592 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-n5wb2"]
Apr 17 20:09:53.027720 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:09:53.027693 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde5f91a3_8486_423a_9602_778d4f161925.slice/crio-73e166a7dd3cd7822e77fcdbe7bfa113b5715f74eb5ba4d7f7ef570cdb857623 WatchSource:0}: Error finding container 73e166a7dd3cd7822e77fcdbe7bfa113b5715f74eb5ba4d7f7ef570cdb857623: Status 404 returned error can't find the container with id 73e166a7dd3cd7822e77fcdbe7bfa113b5715f74eb5ba4d7f7ef570cdb857623
Apr 17 20:09:53.029826 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:53.029808 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 20:09:53.745293 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:53.745255 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2" event={"ID":"de5f91a3-8486-423a-9602-778d4f161925","Type":"ContainerStarted","Data":"73e166a7dd3cd7822e77fcdbe7bfa113b5715f74eb5ba4d7f7ef570cdb857623"}
Apr 17 20:09:56.756191 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:56.756147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2" event={"ID":"de5f91a3-8486-423a-9602-778d4f161925","Type":"ContainerStarted","Data":"55fd2d990f483272b08878d528a0d4d8ff1db05388e7176a565a3c3a86dca8f5"}
Apr 17 20:09:56.756642 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:56.756279 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2"
Apr 17 20:09:56.770952 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:09:56.770834 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2" podStartSLOduration=1.291991346 podStartE2EDuration="4.770815939s" podCreationTimestamp="2026-04-17 20:09:52 +0000 UTC" firstStartedPulling="2026-04-17 20:09:53.029964596 +0000 UTC m=+331.897784335" lastFinishedPulling="2026-04-17 20:09:56.508789188 +0000 UTC m=+335.376608928" observedRunningTime="2026-04-17 20:09:56.770146342 +0000 UTC m=+335.637966127" watchObservedRunningTime="2026-04-17 20:09:56.770815939 +0000 UTC m=+335.638635702"
Apr 17 20:10:02.761287 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:02.761215 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-n5wb2"
Apr 17 20:10:22.917073 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:22.917042 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"]
Apr 17 20:10:22.924304 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:22.924287 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:22.930763 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:22.930482 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-csxrb\""
Apr 17 20:10:22.930763 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:22.930550 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 20:10:22.930763 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:22.930482 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 20:10:22.930976 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:22.930926 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 20:10:22.931056 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:22.931037 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 20:10:22.942886 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:22.942863 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"]
Apr 17 20:10:23.055108 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:23.055071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f85k\" (UniqueName: \"kubernetes.io/projected/cff41bde-c7d2-44c8-ad4b-befce0847da2-kube-api-access-4f85k\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc\" (UID: \"cff41bde-c7d2-44c8-ad4b-befce0847da2\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:23.055108 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:23.055108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cff41bde-c7d2-44c8-ad4b-befce0847da2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc\" (UID: \"cff41bde-c7d2-44c8-ad4b-befce0847da2\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:23.055331 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:23.055211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cff41bde-c7d2-44c8-ad4b-befce0847da2-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc\" (UID: \"cff41bde-c7d2-44c8-ad4b-befce0847da2\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:23.156105 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:23.156069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4f85k\" (UniqueName: \"kubernetes.io/projected/cff41bde-c7d2-44c8-ad4b-befce0847da2-kube-api-access-4f85k\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc\" (UID: \"cff41bde-c7d2-44c8-ad4b-befce0847da2\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:23.156105 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:23.156107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cff41bde-c7d2-44c8-ad4b-befce0847da2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc\" (UID: \"cff41bde-c7d2-44c8-ad4b-befce0847da2\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:23.156357 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:23.156255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cff41bde-c7d2-44c8-ad4b-befce0847da2-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc\" (UID: \"cff41bde-c7d2-44c8-ad4b-befce0847da2\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:23.158634 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:23.158612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cff41bde-c7d2-44c8-ad4b-befce0847da2-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc\" (UID: \"cff41bde-c7d2-44c8-ad4b-befce0847da2\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:23.158749 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:23.158729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cff41bde-c7d2-44c8-ad4b-befce0847da2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc\" (UID: \"cff41bde-c7d2-44c8-ad4b-befce0847da2\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:23.167576 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:23.167513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f85k\" (UniqueName: \"kubernetes.io/projected/cff41bde-c7d2-44c8-ad4b-befce0847da2-kube-api-access-4f85k\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc\" (UID: \"cff41bde-c7d2-44c8-ad4b-befce0847da2\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:23.233964 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:23.233928 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:23.357117 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:23.357094 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"]
Apr 17 20:10:23.359826 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:10:23.359800 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcff41bde_c7d2_44c8_ad4b_befce0847da2.slice/crio-a03f24ca0d9b585132f090dbb94d007b651000615b15f5e8170e446dac2e3bf4 WatchSource:0}: Error finding container a03f24ca0d9b585132f090dbb94d007b651000615b15f5e8170e446dac2e3bf4: Status 404 returned error can't find the container with id a03f24ca0d9b585132f090dbb94d007b651000615b15f5e8170e446dac2e3bf4
Apr 17 20:10:23.833586 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:23.833545 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc" event={"ID":"cff41bde-c7d2-44c8-ad4b-befce0847da2","Type":"ContainerStarted","Data":"a03f24ca0d9b585132f090dbb94d007b651000615b15f5e8170e446dac2e3bf4"}
Apr 17 20:10:26.844318 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:26.844280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc" event={"ID":"cff41bde-c7d2-44c8-ad4b-befce0847da2","Type":"ContainerStarted","Data":"a80667f10d9162d9970ce53fef4f46433f4833bc6d24f761ce8b721aa9281d38"}
Apr 17 20:10:26.844725 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:26.844385 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:26.867549 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:26.867501 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc" podStartSLOduration=2.443781248 podStartE2EDuration="4.867488095s" podCreationTimestamp="2026-04-17 20:10:22 +0000 UTC" firstStartedPulling="2026-04-17 20:10:23.361652195 +0000 UTC m=+362.229471934" lastFinishedPulling="2026-04-17 20:10:25.785359031 +0000 UTC m=+364.653178781" observedRunningTime="2026-04-17 20:10:26.866281426 +0000 UTC m=+365.734101197" watchObservedRunningTime="2026-04-17 20:10:26.867488095 +0000 UTC m=+365.735307855"
Apr 17 20:10:33.707513 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.707481 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"]
Apr 17 20:10:33.710343 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.710327 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:33.714100 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.714067 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-pgrnq\""
Apr 17 20:10:33.714250 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.714177 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 17 20:10:33.714316 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.714077 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 20:10:33.714393 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.714075 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:10:33.714999 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.714976 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 17 20:10:33.715190 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.715172 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 17 20:10:33.719575 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.719550 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"]
Apr 17 20:10:33.851428 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.851381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mffr8\" (UniqueName: \"kubernetes.io/projected/4fac432b-3436-490f-acc2-b7c3e4f955da-kube-api-access-mffr8\") pod \"lws-controller-manager-fcb6f8ffb-q95vl\" (UID: \"4fac432b-3436-490f-acc2-b7c3e4f955da\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:33.851428 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.851430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fac432b-3436-490f-acc2-b7c3e4f955da-cert\") pod \"lws-controller-manager-fcb6f8ffb-q95vl\" (UID: \"4fac432b-3436-490f-acc2-b7c3e4f955da\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:33.851641 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.851449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4fac432b-3436-490f-acc2-b7c3e4f955da-manager-config\") pod \"lws-controller-manager-fcb6f8ffb-q95vl\" (UID: \"4fac432b-3436-490f-acc2-b7c3e4f955da\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:33.851641 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.851496 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fac432b-3436-490f-acc2-b7c3e4f955da-metrics-cert\") pod \"lws-controller-manager-fcb6f8ffb-q95vl\" (UID: \"4fac432b-3436-490f-acc2-b7c3e4f955da\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:33.951966 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.951923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fac432b-3436-490f-acc2-b7c3e4f955da-metrics-cert\") pod \"lws-controller-manager-fcb6f8ffb-q95vl\" (UID: \"4fac432b-3436-490f-acc2-b7c3e4f955da\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:33.952138 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.952014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mffr8\" (UniqueName: \"kubernetes.io/projected/4fac432b-3436-490f-acc2-b7c3e4f955da-kube-api-access-mffr8\") pod \"lws-controller-manager-fcb6f8ffb-q95vl\" (UID: \"4fac432b-3436-490f-acc2-b7c3e4f955da\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:33.952138 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.952040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fac432b-3436-490f-acc2-b7c3e4f955da-cert\") pod \"lws-controller-manager-fcb6f8ffb-q95vl\" (UID: \"4fac432b-3436-490f-acc2-b7c3e4f955da\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:33.952138 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.952056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4fac432b-3436-490f-acc2-b7c3e4f955da-manager-config\") pod \"lws-controller-manager-fcb6f8ffb-q95vl\" (UID: \"4fac432b-3436-490f-acc2-b7c3e4f955da\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:33.952748 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.952723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4fac432b-3436-490f-acc2-b7c3e4f955da-manager-config\") pod \"lws-controller-manager-fcb6f8ffb-q95vl\" (UID: \"4fac432b-3436-490f-acc2-b7c3e4f955da\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:33.954388 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.954364 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fac432b-3436-490f-acc2-b7c3e4f955da-metrics-cert\") pod \"lws-controller-manager-fcb6f8ffb-q95vl\" (UID: \"4fac432b-3436-490f-acc2-b7c3e4f955da\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:33.954475 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.954434 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fac432b-3436-490f-acc2-b7c3e4f955da-cert\") pod \"lws-controller-manager-fcb6f8ffb-q95vl\" (UID: \"4fac432b-3436-490f-acc2-b7c3e4f955da\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:33.966476 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:33.966402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mffr8\" (UniqueName: \"kubernetes.io/projected/4fac432b-3436-490f-acc2-b7c3e4f955da-kube-api-access-mffr8\") pod \"lws-controller-manager-fcb6f8ffb-q95vl\" (UID: \"4fac432b-3436-490f-acc2-b7c3e4f955da\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:34.022453 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:34.022407 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:34.143717 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:34.143687 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"]
Apr 17 20:10:34.146049 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:10:34.146022 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fac432b_3436_490f_acc2_b7c3e4f955da.slice/crio-c8570a9242bf8f218e908261fc78a8b1219aadc3f4c11e2d386ff21b7e74c41b WatchSource:0}: Error finding container c8570a9242bf8f218e908261fc78a8b1219aadc3f4c11e2d386ff21b7e74c41b: Status 404 returned error can't find the container with id c8570a9242bf8f218e908261fc78a8b1219aadc3f4c11e2d386ff21b7e74c41b
Apr 17 20:10:34.869740 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:34.869705 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl" event={"ID":"4fac432b-3436-490f-acc2-b7c3e4f955da","Type":"ContainerStarted","Data":"c8570a9242bf8f218e908261fc78a8b1219aadc3f4c11e2d386ff21b7e74c41b"}
Apr 17 20:10:36.877802 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:36.877702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl" event={"ID":"4fac432b-3436-490f-acc2-b7c3e4f955da","Type":"ContainerStarted","Data":"92ad405d7c1b497e91f1c469193f410dc3ba5eecfb745311e5ede1f518ff73cf"}
Apr 17 20:10:36.877802 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:36.877764 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:10:36.893949 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:36.893884 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl" podStartSLOduration=1.468184948 podStartE2EDuration="3.893869043s" podCreationTimestamp="2026-04-17 20:10:33 +0000 UTC" firstStartedPulling="2026-04-17 20:10:34.147747868 +0000 UTC m=+373.015567607" lastFinishedPulling="2026-04-17 20:10:36.573431958 +0000 UTC m=+375.441251702" observedRunningTime="2026-04-17 20:10:36.891982857 +0000 UTC m=+375.759802617" watchObservedRunningTime="2026-04-17 20:10:36.893869043 +0000 UTC m=+375.761688804"
Apr 17 20:10:37.850343 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:37.850316 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc"
Apr 17 20:10:40.516124 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.516089 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"]
Apr 17 20:10:40.519588 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.519569 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"
Apr 17 20:10:40.521947 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.521926 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 17 20:10:40.522078 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.521959 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 17 20:10:40.522078 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.521989 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-lx7ns\""
Apr 17 20:10:40.534931 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.533799 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"]
Apr 17 20:10:40.607679 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.607636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ead0945-3986-4af6-8922-0d1eeabb3e3b-tmp\") pod \"kube-auth-proxy-6568cc58bc-nthq7\" (UID: \"2ead0945-3986-4af6-8922-0d1eeabb3e3b\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"
Apr 17 20:10:40.607679 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.607676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl45m\" (UniqueName: \"kubernetes.io/projected/2ead0945-3986-4af6-8922-0d1eeabb3e3b-kube-api-access-hl45m\") pod \"kube-auth-proxy-6568cc58bc-nthq7\" (UID: \"2ead0945-3986-4af6-8922-0d1eeabb3e3b\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"
Apr 17 20:10:40.607914 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.607824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ead0945-3986-4af6-8922-0d1eeabb3e3b-tls-certs\") pod \"kube-auth-proxy-6568cc58bc-nthq7\" (UID: \"2ead0945-3986-4af6-8922-0d1eeabb3e3b\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"
Apr 17 20:10:40.709129 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.709087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ead0945-3986-4af6-8922-0d1eeabb3e3b-tls-certs\") pod \"kube-auth-proxy-6568cc58bc-nthq7\" (UID: \"2ead0945-3986-4af6-8922-0d1eeabb3e3b\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"
Apr 17 20:10:40.709283 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.709141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ead0945-3986-4af6-8922-0d1eeabb3e3b-tmp\") pod \"kube-auth-proxy-6568cc58bc-nthq7\" (UID: \"2ead0945-3986-4af6-8922-0d1eeabb3e3b\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"
Apr 17 20:10:40.709359 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.709312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl45m\" (UniqueName: \"kubernetes.io/projected/2ead0945-3986-4af6-8922-0d1eeabb3e3b-kube-api-access-hl45m\") pod \"kube-auth-proxy-6568cc58bc-nthq7\" (UID: \"2ead0945-3986-4af6-8922-0d1eeabb3e3b\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"
Apr 17 20:10:40.711482 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.711451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ead0945-3986-4af6-8922-0d1eeabb3e3b-tmp\") pod \"kube-auth-proxy-6568cc58bc-nthq7\" (UID: \"2ead0945-3986-4af6-8922-0d1eeabb3e3b\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"
Apr 17 20:10:40.711720 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.711699 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ead0945-3986-4af6-8922-0d1eeabb3e3b-tls-certs\") pod \"kube-auth-proxy-6568cc58bc-nthq7\" (UID: \"2ead0945-3986-4af6-8922-0d1eeabb3e3b\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"
Apr 17 20:10:40.718283 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.718259 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl45m\" (UniqueName: \"kubernetes.io/projected/2ead0945-3986-4af6-8922-0d1eeabb3e3b-kube-api-access-hl45m\") pod \"kube-auth-proxy-6568cc58bc-nthq7\" (UID: \"2ead0945-3986-4af6-8922-0d1eeabb3e3b\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"
Apr 17 20:10:40.831977 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.831938 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"
Apr 17 20:10:40.954974 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:40.954937 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7"]
Apr 17 20:10:40.958782 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:10:40.958747 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ead0945_3986_4af6_8922_0d1eeabb3e3b.slice/crio-d65840932456649ca80ce2ccd052fd05103a580aeb22f11411d9bdb058054609 WatchSource:0}: Error finding container d65840932456649ca80ce2ccd052fd05103a580aeb22f11411d9bdb058054609: Status 404 returned error can't find the container with id d65840932456649ca80ce2ccd052fd05103a580aeb22f11411d9bdb058054609
Apr 17 20:10:41.898796 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:41.898740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7" event={"ID":"2ead0945-3986-4af6-8922-0d1eeabb3e3b","Type":"ContainerStarted","Data":"d65840932456649ca80ce2ccd052fd05103a580aeb22f11411d9bdb058054609"}
Apr 17 20:10:44.915577 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:44.915483 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7" event={"ID":"2ead0945-3986-4af6-8922-0d1eeabb3e3b","Type":"ContainerStarted","Data":"800d25b76ca82c4322f9a5af19b95baacbd58ae9e5fc54a96cedf20f8bd239ef"}
Apr 17 20:10:44.932191 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:44.932148 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nthq7" podStartSLOduration=1.313793878 podStartE2EDuration="4.932133993s" podCreationTimestamp="2026-04-17 20:10:40 +0000 UTC" firstStartedPulling="2026-04-17 20:10:40.960615127 +0000 UTC m=+379.828434866" lastFinishedPulling="2026-04-17 20:10:44.578955242 +0000 UTC m=+383.446774981" observedRunningTime="2026-04-17 20:10:44.930255562 +0000 UTC m=+383.798075333" watchObservedRunningTime="2026-04-17 20:10:44.932133993 +0000 UTC m=+383.799953753"
Apr 17 20:10:47.884030 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:10:47.883998 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-q95vl"
Apr 17 20:12:21.550453 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:12:21.550419 2576 scope.go:117] "RemoveContainer" containerID="aceccba09fec67b6a07cee09e662f287b4bfe2bdcbff21e72345c3a085f386ab"
Apr 17 20:13:28.552509 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.552483 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fhhwn"]
Apr 17 20:13:28.555835 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.555818 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn"
Apr 17 20:13:28.558038 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.558019 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 20:13:28.558299 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.558283 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-vg7cl\""
Apr 17 20:13:28.558354 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.558295 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 20:13:28.558903 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.558886 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 20:13:28.563441 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.563407 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fhhwn"]
Apr 17 20:13:28.657751 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.657721 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fhhwn"]
Apr 17 20:13:28.694126 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.694096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ad4503d5-73cc-41f2-9605-70ea78a702ca-config-file\") pod \"limitador-limitador-7d549b5b-fhhwn\" (UID: \"ad4503d5-73cc-41f2-9605-70ea78a702ca\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn"
Apr 17 20:13:28.694282 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.694131 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g48c\" (UniqueName: \"kubernetes.io/projected/ad4503d5-73cc-41f2-9605-70ea78a702ca-kube-api-access-4g48c\") pod \"limitador-limitador-7d549b5b-fhhwn\" (UID: \"ad4503d5-73cc-41f2-9605-70ea78a702ca\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn"
Apr 17 20:13:28.795098 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.795062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ad4503d5-73cc-41f2-9605-70ea78a702ca-config-file\") pod \"limitador-limitador-7d549b5b-fhhwn\" (UID: \"ad4503d5-73cc-41f2-9605-70ea78a702ca\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn"
Apr 17 20:13:28.795294 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.795110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4g48c\" (UniqueName: \"kubernetes.io/projected/ad4503d5-73cc-41f2-9605-70ea78a702ca-kube-api-access-4g48c\") pod \"limitador-limitador-7d549b5b-fhhwn\" (UID: \"ad4503d5-73cc-41f2-9605-70ea78a702ca\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn"
Apr 17 20:13:28.795741 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.795720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ad4503d5-73cc-41f2-9605-70ea78a702ca-config-file\") pod \"limitador-limitador-7d549b5b-fhhwn\" (UID: \"ad4503d5-73cc-41f2-9605-70ea78a702ca\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn"
Apr 17 20:13:28.802428 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.802401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g48c\" (UniqueName: \"kubernetes.io/projected/ad4503d5-73cc-41f2-9605-70ea78a702ca-kube-api-access-4g48c\") pod \"limitador-limitador-7d549b5b-fhhwn\" (UID: \"ad4503d5-73cc-41f2-9605-70ea78a702ca\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn"
Apr 17 20:13:28.866449 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.866356 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn"
Apr 17 20:13:28.986338 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:28.986301 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fhhwn"]
Apr 17 20:13:28.988826 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:13:28.988801 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad4503d5_73cc_41f2_9605_70ea78a702ca.slice/crio-2ef62b46687f15f7ec713b11557cb8e19f7003c9101f680ce2a5cb34663194be WatchSource:0}: Error finding container 2ef62b46687f15f7ec713b11557cb8e19f7003c9101f680ce2a5cb34663194be: Status 404 returned error can't find the container with id 2ef62b46687f15f7ec713b11557cb8e19f7003c9101f680ce2a5cb34663194be
Apr 17 20:13:29.365194 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.365163 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-6vtnx"]
Apr 17 20:13:29.370203 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.370178 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-6vtnx"
Apr 17 20:13:29.372497 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.372473 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-lp6jc\""
Apr 17 20:13:29.373654 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.373630 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-6vtnx"]
Apr 17 20:13:29.445673 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.445639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn" event={"ID":"ad4503d5-73cc-41f2-9605-70ea78a702ca","Type":"ContainerStarted","Data":"2ef62b46687f15f7ec713b11557cb8e19f7003c9101f680ce2a5cb34663194be"}
Apr 17 20:13:29.500526 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.500488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6ljt\" (UniqueName: \"kubernetes.io/projected/755db018-6048-474e-bc46-94402ec6cd70-kube-api-access-j6ljt\") pod \"authorino-f99f4b5cd-6vtnx\" (UID: \"755db018-6048-474e-bc46-94402ec6cd70\") " pod="kuadrant-system/authorino-f99f4b5cd-6vtnx"
Apr 17 20:13:29.563950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.563915 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-vn757"]
Apr 17 20:13:29.567203 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.567187 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-vn757"
Apr 17 20:13:29.574620 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.574592 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-vn757"]
Apr 17 20:13:29.601475 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.601440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6ljt\" (UniqueName: \"kubernetes.io/projected/755db018-6048-474e-bc46-94402ec6cd70-kube-api-access-j6ljt\") pod \"authorino-f99f4b5cd-6vtnx\" (UID: \"755db018-6048-474e-bc46-94402ec6cd70\") " pod="kuadrant-system/authorino-f99f4b5cd-6vtnx"
Apr 17 20:13:29.608932 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.608901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6ljt\" (UniqueName: \"kubernetes.io/projected/755db018-6048-474e-bc46-94402ec6cd70-kube-api-access-j6ljt\") pod \"authorino-f99f4b5cd-6vtnx\" (UID: \"755db018-6048-474e-bc46-94402ec6cd70\") " pod="kuadrant-system/authorino-f99f4b5cd-6vtnx"
Apr 17 20:13:29.681558 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.681480 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-6vtnx" Apr 17 20:13:29.703021 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.702985 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmvtn\" (UniqueName: \"kubernetes.io/projected/beb9ac41-f38b-4a84-a442-db39de1c166c-kube-api-access-mmvtn\") pod \"authorino-7498df8756-vn757\" (UID: \"beb9ac41-f38b-4a84-a442-db39de1c166c\") " pod="kuadrant-system/authorino-7498df8756-vn757" Apr 17 20:13:29.805059 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.804531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmvtn\" (UniqueName: \"kubernetes.io/projected/beb9ac41-f38b-4a84-a442-db39de1c166c-kube-api-access-mmvtn\") pod \"authorino-7498df8756-vn757\" (UID: \"beb9ac41-f38b-4a84-a442-db39de1c166c\") " pod="kuadrant-system/authorino-7498df8756-vn757" Apr 17 20:13:29.818171 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.818108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmvtn\" (UniqueName: \"kubernetes.io/projected/beb9ac41-f38b-4a84-a442-db39de1c166c-kube-api-access-mmvtn\") pod \"authorino-7498df8756-vn757\" (UID: \"beb9ac41-f38b-4a84-a442-db39de1c166c\") " pod="kuadrant-system/authorino-7498df8756-vn757" Apr 17 20:13:29.833308 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.833280 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-6vtnx"] Apr 17 20:13:29.838821 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:13:29.838788 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod755db018_6048_474e_bc46_94402ec6cd70.slice/crio-df7c0f8a3c4a5707ccf006dd22d450da0ba7a932c306221e3a063c4568dc24c4 WatchSource:0}: Error finding container df7c0f8a3c4a5707ccf006dd22d450da0ba7a932c306221e3a063c4568dc24c4: Status 404 returned 
error can't find the container with id df7c0f8a3c4a5707ccf006dd22d450da0ba7a932c306221e3a063c4568dc24c4 Apr 17 20:13:29.876370 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:29.876322 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-vn757" Apr 17 20:13:30.041884 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:30.041822 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-vn757"] Apr 17 20:13:30.045518 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:13:30.045486 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbeb9ac41_f38b_4a84_a442_db39de1c166c.slice/crio-298a560c0ab8da87a40728439fa6f7c70e5e12f7e9a61df04d8348e6f3f8f0da WatchSource:0}: Error finding container 298a560c0ab8da87a40728439fa6f7c70e5e12f7e9a61df04d8348e6f3f8f0da: Status 404 returned error can't find the container with id 298a560c0ab8da87a40728439fa6f7c70e5e12f7e9a61df04d8348e6f3f8f0da Apr 17 20:13:30.451321 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:30.451267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-6vtnx" event={"ID":"755db018-6048-474e-bc46-94402ec6cd70","Type":"ContainerStarted","Data":"df7c0f8a3c4a5707ccf006dd22d450da0ba7a932c306221e3a063c4568dc24c4"} Apr 17 20:13:30.452599 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:30.452559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-vn757" event={"ID":"beb9ac41-f38b-4a84-a442-db39de1c166c","Type":"ContainerStarted","Data":"298a560c0ab8da87a40728439fa6f7c70e5e12f7e9a61df04d8348e6f3f8f0da"} Apr 17 20:13:33.466309 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:33.466220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn" 
event={"ID":"ad4503d5-73cc-41f2-9605-70ea78a702ca","Type":"ContainerStarted","Data":"bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d"} Apr 17 20:13:33.466725 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:33.466314 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn" Apr 17 20:13:33.467525 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:33.467503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-vn757" event={"ID":"beb9ac41-f38b-4a84-a442-db39de1c166c","Type":"ContainerStarted","Data":"796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c"} Apr 17 20:13:33.468780 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:33.468756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-6vtnx" event={"ID":"755db018-6048-474e-bc46-94402ec6cd70","Type":"ContainerStarted","Data":"3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442"} Apr 17 20:13:33.482778 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:33.482738 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn" podStartSLOduration=1.351885029 podStartE2EDuration="5.482726117s" podCreationTimestamp="2026-04-17 20:13:28 +0000 UTC" firstStartedPulling="2026-04-17 20:13:28.990736198 +0000 UTC m=+547.858555952" lastFinishedPulling="2026-04-17 20:13:33.121577295 +0000 UTC m=+551.989397040" observedRunningTime="2026-04-17 20:13:33.480484999 +0000 UTC m=+552.348304759" watchObservedRunningTime="2026-04-17 20:13:33.482726117 +0000 UTC m=+552.350545878" Apr 17 20:13:33.493566 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:33.493522 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-6vtnx" podStartSLOduration=1.263786126 podStartE2EDuration="4.493509717s" podCreationTimestamp="2026-04-17 
20:13:29 +0000 UTC" firstStartedPulling="2026-04-17 20:13:29.8408721 +0000 UTC m=+548.708691846" lastFinishedPulling="2026-04-17 20:13:33.070595683 +0000 UTC m=+551.938415437" observedRunningTime="2026-04-17 20:13:33.492326812 +0000 UTC m=+552.360146574" watchObservedRunningTime="2026-04-17 20:13:33.493509717 +0000 UTC m=+552.361329478" Apr 17 20:13:33.531081 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:33.531032 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-vn757" podStartSLOduration=1.496177584 podStartE2EDuration="4.531015055s" podCreationTimestamp="2026-04-17 20:13:29 +0000 UTC" firstStartedPulling="2026-04-17 20:13:30.047148949 +0000 UTC m=+548.914968700" lastFinishedPulling="2026-04-17 20:13:33.081986429 +0000 UTC m=+551.949806171" observedRunningTime="2026-04-17 20:13:33.506207286 +0000 UTC m=+552.374027047" watchObservedRunningTime="2026-04-17 20:13:33.531015055 +0000 UTC m=+552.398834815" Apr 17 20:13:33.531684 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:33.531666 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-6vtnx"] Apr 17 20:13:35.475500 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:35.475456 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-6vtnx" podUID="755db018-6048-474e-bc46-94402ec6cd70" containerName="authorino" containerID="cri-o://3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442" gracePeriod=30 Apr 17 20:13:35.715405 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:35.715379 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-6vtnx" Apr 17 20:13:35.862220 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:35.862185 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6ljt\" (UniqueName: \"kubernetes.io/projected/755db018-6048-474e-bc46-94402ec6cd70-kube-api-access-j6ljt\") pod \"755db018-6048-474e-bc46-94402ec6cd70\" (UID: \"755db018-6048-474e-bc46-94402ec6cd70\") " Apr 17 20:13:35.864385 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:35.864363 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755db018-6048-474e-bc46-94402ec6cd70-kube-api-access-j6ljt" (OuterVolumeSpecName: "kube-api-access-j6ljt") pod "755db018-6048-474e-bc46-94402ec6cd70" (UID: "755db018-6048-474e-bc46-94402ec6cd70"). InnerVolumeSpecName "kube-api-access-j6ljt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:13:35.962997 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:35.962956 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j6ljt\" (UniqueName: \"kubernetes.io/projected/755db018-6048-474e-bc46-94402ec6cd70-kube-api-access-j6ljt\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:13:36.479998 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:36.479963 2576 generic.go:358] "Generic (PLEG): container finished" podID="755db018-6048-474e-bc46-94402ec6cd70" containerID="3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442" exitCode=0 Apr 17 20:13:36.480416 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:36.480020 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-6vtnx" Apr 17 20:13:36.480416 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:36.480068 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-6vtnx" event={"ID":"755db018-6048-474e-bc46-94402ec6cd70","Type":"ContainerDied","Data":"3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442"} Apr 17 20:13:36.480416 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:36.480095 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-6vtnx" event={"ID":"755db018-6048-474e-bc46-94402ec6cd70","Type":"ContainerDied","Data":"df7c0f8a3c4a5707ccf006dd22d450da0ba7a932c306221e3a063c4568dc24c4"} Apr 17 20:13:36.480416 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:36.480110 2576 scope.go:117] "RemoveContainer" containerID="3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442" Apr 17 20:13:36.488247 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:36.488222 2576 scope.go:117] "RemoveContainer" containerID="3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442" Apr 17 20:13:36.488507 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:13:36.488489 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442\": container with ID starting with 3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442 not found: ID does not exist" containerID="3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442" Apr 17 20:13:36.488561 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:36.488517 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442"} err="failed to get container status \"3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442\": rpc error: code = 
NotFound desc = could not find container \"3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442\": container with ID starting with 3689f939b5ed12ff0684fc8bdf45e8c419cd90612a9ebd8934ee0527a3ef4442 not found: ID does not exist" Apr 17 20:13:36.500185 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:36.500150 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-6vtnx"] Apr 17 20:13:36.502751 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:36.502725 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-6vtnx"] Apr 17 20:13:37.638056 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:37.638024 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755db018-6048-474e-bc46-94402ec6cd70" path="/var/lib/kubelet/pods/755db018-6048-474e-bc46-94402ec6cd70/volumes" Apr 17 20:13:43.539584 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:43.539544 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fhhwn"] Apr 17 20:13:43.540075 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:43.539774 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn" podUID="ad4503d5-73cc-41f2-9605-70ea78a702ca" containerName="limitador" containerID="cri-o://bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d" gracePeriod=30 Apr 17 20:13:43.540499 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:43.540477 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn" Apr 17 20:13:44.501651 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.501627 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn" Apr 17 20:13:44.505207 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.505179 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad4503d5-73cc-41f2-9605-70ea78a702ca" containerID="bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d" exitCode=0 Apr 17 20:13:44.505320 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.505253 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn" Apr 17 20:13:44.505320 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.505261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn" event={"ID":"ad4503d5-73cc-41f2-9605-70ea78a702ca","Type":"ContainerDied","Data":"bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d"} Apr 17 20:13:44.505320 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.505299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-fhhwn" event={"ID":"ad4503d5-73cc-41f2-9605-70ea78a702ca","Type":"ContainerDied","Data":"2ef62b46687f15f7ec713b11557cb8e19f7003c9101f680ce2a5cb34663194be"} Apr 17 20:13:44.505320 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.505319 2576 scope.go:117] "RemoveContainer" containerID="bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d" Apr 17 20:13:44.512746 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.512713 2576 scope.go:117] "RemoveContainer" containerID="bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d" Apr 17 20:13:44.513059 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:13:44.513038 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d\": container with ID starting with 
bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d not found: ID does not exist" containerID="bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d" Apr 17 20:13:44.513127 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.513068 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d"} err="failed to get container status \"bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d\": rpc error: code = NotFound desc = could not find container \"bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d\": container with ID starting with bc68f36e5b17e0362446e9b048b122e2a25def953ba5d3517e18e372625f3c6d not found: ID does not exist" Apr 17 20:13:44.632078 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.631984 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g48c\" (UniqueName: \"kubernetes.io/projected/ad4503d5-73cc-41f2-9605-70ea78a702ca-kube-api-access-4g48c\") pod \"ad4503d5-73cc-41f2-9605-70ea78a702ca\" (UID: \"ad4503d5-73cc-41f2-9605-70ea78a702ca\") " Apr 17 20:13:44.632078 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.632049 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ad4503d5-73cc-41f2-9605-70ea78a702ca-config-file\") pod \"ad4503d5-73cc-41f2-9605-70ea78a702ca\" (UID: \"ad4503d5-73cc-41f2-9605-70ea78a702ca\") " Apr 17 20:13:44.632465 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.632424 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad4503d5-73cc-41f2-9605-70ea78a702ca-config-file" (OuterVolumeSpecName: "config-file") pod "ad4503d5-73cc-41f2-9605-70ea78a702ca" (UID: "ad4503d5-73cc-41f2-9605-70ea78a702ca"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:13:44.634257 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.634234 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4503d5-73cc-41f2-9605-70ea78a702ca-kube-api-access-4g48c" (OuterVolumeSpecName: "kube-api-access-4g48c") pod "ad4503d5-73cc-41f2-9605-70ea78a702ca" (UID: "ad4503d5-73cc-41f2-9605-70ea78a702ca"). InnerVolumeSpecName "kube-api-access-4g48c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:13:44.733479 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.733441 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4g48c\" (UniqueName: \"kubernetes.io/projected/ad4503d5-73cc-41f2-9605-70ea78a702ca-kube-api-access-4g48c\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:13:44.733479 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.733473 2576 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ad4503d5-73cc-41f2-9605-70ea78a702ca-config-file\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:13:44.824961 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.824928 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fhhwn"] Apr 17 20:13:44.827529 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:44.827503 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fhhwn"] Apr 17 20:13:45.637935 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:45.637901 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad4503d5-73cc-41f2-9605-70ea78a702ca" path="/var/lib/kubelet/pods/ad4503d5-73cc-41f2-9605-70ea78a702ca/volumes" Apr 17 20:13:57.152017 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.151979 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/authorino-8b475cf9f-4vw2w"] Apr 17 20:13:57.152375 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.152318 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad4503d5-73cc-41f2-9605-70ea78a702ca" containerName="limitador" Apr 17 20:13:57.152375 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.152330 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4503d5-73cc-41f2-9605-70ea78a702ca" containerName="limitador" Apr 17 20:13:57.152375 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.152341 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="755db018-6048-474e-bc46-94402ec6cd70" containerName="authorino" Apr 17 20:13:57.152375 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.152347 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="755db018-6048-474e-bc46-94402ec6cd70" containerName="authorino" Apr 17 20:13:57.152530 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.152403 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad4503d5-73cc-41f2-9605-70ea78a702ca" containerName="limitador" Apr 17 20:13:57.152530 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.152413 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="755db018-6048-474e-bc46-94402ec6cd70" containerName="authorino" Apr 17 20:13:57.160281 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.159998 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-4vw2w" Apr 17 20:13:57.161796 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.161773 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4vw2w"] Apr 17 20:13:57.240943 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.240910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t958m\" (UniqueName: \"kubernetes.io/projected/2f1cf45b-3c65-4be4-98b0-56c73abdb5fa-kube-api-access-t958m\") pod \"authorino-8b475cf9f-4vw2w\" (UID: \"2f1cf45b-3c65-4be4-98b0-56c73abdb5fa\") " pod="kuadrant-system/authorino-8b475cf9f-4vw2w" Apr 17 20:13:57.342226 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.342167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t958m\" (UniqueName: \"kubernetes.io/projected/2f1cf45b-3c65-4be4-98b0-56c73abdb5fa-kube-api-access-t958m\") pod \"authorino-8b475cf9f-4vw2w\" (UID: \"2f1cf45b-3c65-4be4-98b0-56c73abdb5fa\") " pod="kuadrant-system/authorino-8b475cf9f-4vw2w" Apr 17 20:13:57.350127 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.350095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t958m\" (UniqueName: \"kubernetes.io/projected/2f1cf45b-3c65-4be4-98b0-56c73abdb5fa-kube-api-access-t958m\") pod \"authorino-8b475cf9f-4vw2w\" (UID: \"2f1cf45b-3c65-4be4-98b0-56c73abdb5fa\") " pod="kuadrant-system/authorino-8b475cf9f-4vw2w" Apr 17 20:13:57.369207 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.369178 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4vw2w"] Apr 17 20:13:57.369415 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.369404 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-4vw2w" Apr 17 20:13:57.394809 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.394778 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79d69c56f8-pcj72"] Apr 17 20:13:57.399580 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.399560 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79d69c56f8-pcj72" Apr 17 20:13:57.402859 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.402777 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79d69c56f8-pcj72"] Apr 17 20:13:57.442756 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.442723 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79d69c56f8-pcj72"] Apr 17 20:13:57.443062 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:13:57.443031 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-9pzns], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-79d69c56f8-pcj72" podUID="248db835-de5c-4fd4-8afd-9a61c9abcaea" Apr 17 20:13:57.443205 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.443032 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pzns\" (UniqueName: \"kubernetes.io/projected/248db835-de5c-4fd4-8afd-9a61c9abcaea-kube-api-access-9pzns\") pod \"authorino-79d69c56f8-pcj72\" (UID: \"248db835-de5c-4fd4-8afd-9a61c9abcaea\") " pod="kuadrant-system/authorino-79d69c56f8-pcj72" Apr 17 20:13:57.467799 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.467769 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-95444fd86-rgwr8"] Apr 17 20:13:57.471602 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.471582 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-95444fd86-rgwr8" Apr 17 20:13:57.473678 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.473656 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 20:13:57.477653 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.477628 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-95444fd86-rgwr8"] Apr 17 20:13:57.496370 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.496344 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4vw2w"] Apr 17 20:13:57.500569 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:13:57.500543 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f1cf45b_3c65_4be4_98b0_56c73abdb5fa.slice/crio-b3d3627aa28b29bf140ac9dd4f9f4c09426e6bbd9a589b2d6a11e0cb2b1013bb WatchSource:0}: Error finding container b3d3627aa28b29bf140ac9dd4f9f4c09426e6bbd9a589b2d6a11e0cb2b1013bb: Status 404 returned error can't find the container with id b3d3627aa28b29bf140ac9dd4f9f4c09426e6bbd9a589b2d6a11e0cb2b1013bb Apr 17 20:13:57.543408 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.543356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbc2x\" (UniqueName: \"kubernetes.io/projected/f944698f-a710-4c44-a3a5-8ebb500e3d86-kube-api-access-tbc2x\") pod \"authorino-95444fd86-rgwr8\" (UID: \"f944698f-a710-4c44-a3a5-8ebb500e3d86\") " pod="kuadrant-system/authorino-95444fd86-rgwr8" Apr 17 20:13:57.543579 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.543481 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f944698f-a710-4c44-a3a5-8ebb500e3d86-tls-cert\") pod \"authorino-95444fd86-rgwr8\" (UID: 
\"f944698f-a710-4c44-a3a5-8ebb500e3d86\") " pod="kuadrant-system/authorino-95444fd86-rgwr8" Apr 17 20:13:57.543579 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.543531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pzns\" (UniqueName: \"kubernetes.io/projected/248db835-de5c-4fd4-8afd-9a61c9abcaea-kube-api-access-9pzns\") pod \"authorino-79d69c56f8-pcj72\" (UID: \"248db835-de5c-4fd4-8afd-9a61c9abcaea\") " pod="kuadrant-system/authorino-79d69c56f8-pcj72" Apr 17 20:13:57.546694 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.546666 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-4vw2w" event={"ID":"2f1cf45b-3c65-4be4-98b0-56c73abdb5fa","Type":"ContainerStarted","Data":"b3d3627aa28b29bf140ac9dd4f9f4c09426e6bbd9a589b2d6a11e0cb2b1013bb"} Apr 17 20:13:57.546694 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.546688 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79d69c56f8-pcj72" Apr 17 20:13:57.550659 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.550633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pzns\" (UniqueName: \"kubernetes.io/projected/248db835-de5c-4fd4-8afd-9a61c9abcaea-kube-api-access-9pzns\") pod \"authorino-79d69c56f8-pcj72\" (UID: \"248db835-de5c-4fd4-8afd-9a61c9abcaea\") " pod="kuadrant-system/authorino-79d69c56f8-pcj72" Apr 17 20:13:57.558630 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.558612 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79d69c56f8-pcj72" Apr 17 20:13:57.643830 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.643799 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pzns\" (UniqueName: \"kubernetes.io/projected/248db835-de5c-4fd4-8afd-9a61c9abcaea-kube-api-access-9pzns\") pod \"248db835-de5c-4fd4-8afd-9a61c9abcaea\" (UID: \"248db835-de5c-4fd4-8afd-9a61c9abcaea\") " Apr 17 20:13:57.643975 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.643933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f944698f-a710-4c44-a3a5-8ebb500e3d86-tls-cert\") pod \"authorino-95444fd86-rgwr8\" (UID: \"f944698f-a710-4c44-a3a5-8ebb500e3d86\") " pod="kuadrant-system/authorino-95444fd86-rgwr8" Apr 17 20:13:57.643975 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.643969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbc2x\" (UniqueName: \"kubernetes.io/projected/f944698f-a710-4c44-a3a5-8ebb500e3d86-kube-api-access-tbc2x\") pod \"authorino-95444fd86-rgwr8\" (UID: \"f944698f-a710-4c44-a3a5-8ebb500e3d86\") " pod="kuadrant-system/authorino-95444fd86-rgwr8" Apr 17 20:13:57.645922 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.645888 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/248db835-de5c-4fd4-8afd-9a61c9abcaea-kube-api-access-9pzns" (OuterVolumeSpecName: "kube-api-access-9pzns") pod "248db835-de5c-4fd4-8afd-9a61c9abcaea" (UID: "248db835-de5c-4fd4-8afd-9a61c9abcaea"). InnerVolumeSpecName "kube-api-access-9pzns". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:13:57.646246 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.646229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f944698f-a710-4c44-a3a5-8ebb500e3d86-tls-cert\") pod \"authorino-95444fd86-rgwr8\" (UID: \"f944698f-a710-4c44-a3a5-8ebb500e3d86\") " pod="kuadrant-system/authorino-95444fd86-rgwr8" Apr 17 20:13:57.651235 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.651212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbc2x\" (UniqueName: \"kubernetes.io/projected/f944698f-a710-4c44-a3a5-8ebb500e3d86-kube-api-access-tbc2x\") pod \"authorino-95444fd86-rgwr8\" (UID: \"f944698f-a710-4c44-a3a5-8ebb500e3d86\") " pod="kuadrant-system/authorino-95444fd86-rgwr8" Apr 17 20:13:57.744732 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.744654 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9pzns\" (UniqueName: \"kubernetes.io/projected/248db835-de5c-4fd4-8afd-9a61c9abcaea-kube-api-access-9pzns\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:13:57.782434 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.782395 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-95444fd86-rgwr8" Apr 17 20:13:57.916933 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:57.916797 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-95444fd86-rgwr8"] Apr 17 20:13:57.920550 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:13:57.920497 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf944698f_a710_4c44_a3a5_8ebb500e3d86.slice/crio-c47683b8d3efa0a1533c5f0e66e176d6f6d92524058d3e6d20875270c763ea48 WatchSource:0}: Error finding container c47683b8d3efa0a1533c5f0e66e176d6f6d92524058d3e6d20875270c763ea48: Status 404 returned error can't find the container with id c47683b8d3efa0a1533c5f0e66e176d6f6d92524058d3e6d20875270c763ea48 Apr 17 20:13:58.551996 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.551969 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-95444fd86-rgwr8" event={"ID":"f944698f-a710-4c44-a3a5-8ebb500e3d86","Type":"ContainerStarted","Data":"749378ad5c6b95e660119d982864ac3bc17a2e4398007dd1dc6766bd22d71c75"} Apr 17 20:13:58.552389 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.552006 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-95444fd86-rgwr8" event={"ID":"f944698f-a710-4c44-a3a5-8ebb500e3d86","Type":"ContainerStarted","Data":"c47683b8d3efa0a1533c5f0e66e176d6f6d92524058d3e6d20875270c763ea48"} Apr 17 20:13:58.553286 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.553265 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79d69c56f8-pcj72" Apr 17 20:13:58.553380 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.553281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-4vw2w" event={"ID":"2f1cf45b-3c65-4be4-98b0-56c73abdb5fa","Type":"ContainerStarted","Data":"5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce"} Apr 17 20:13:58.553380 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.553310 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-4vw2w" podUID="2f1cf45b-3c65-4be4-98b0-56c73abdb5fa" containerName="authorino" containerID="cri-o://5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce" gracePeriod=30 Apr 17 20:13:58.566366 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.566325 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-95444fd86-rgwr8" podStartSLOduration=1.211612933 podStartE2EDuration="1.56631047s" podCreationTimestamp="2026-04-17 20:13:57 +0000 UTC" firstStartedPulling="2026-04-17 20:13:57.922337656 +0000 UTC m=+576.790157396" lastFinishedPulling="2026-04-17 20:13:58.277035192 +0000 UTC m=+577.144854933" observedRunningTime="2026-04-17 20:13:58.565445194 +0000 UTC m=+577.433264954" watchObservedRunningTime="2026-04-17 20:13:58.56631047 +0000 UTC m=+577.434130234" Apr 17 20:13:58.581968 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.581914 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-4vw2w" podStartSLOduration=1.2983372260000001 podStartE2EDuration="1.581897798s" podCreationTimestamp="2026-04-17 20:13:57 +0000 UTC" firstStartedPulling="2026-04-17 20:13:57.501762347 +0000 UTC m=+576.369582086" lastFinishedPulling="2026-04-17 20:13:57.785322919 +0000 UTC m=+576.653142658" observedRunningTime="2026-04-17 20:13:58.580047132 +0000 UTC m=+577.447866896" 
watchObservedRunningTime="2026-04-17 20:13:58.581897798 +0000 UTC m=+577.449717614" Apr 17 20:13:58.593463 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.593434 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-vn757"] Apr 17 20:13:58.593701 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.593676 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-vn757" podUID="beb9ac41-f38b-4a84-a442-db39de1c166c" containerName="authorino" containerID="cri-o://796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c" gracePeriod=30 Apr 17 20:13:58.616593 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.616565 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79d69c56f8-pcj72"] Apr 17 20:13:58.619525 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.619497 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79d69c56f8-pcj72"] Apr 17 20:13:58.640883 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:13:58.637439 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbeb9ac41_f38b_4a84_a442_db39de1c166c.slice/crio-796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c.scope\": RecentStats: unable to find data in memory cache]" Apr 17 20:13:58.845318 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.845294 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-4vw2w" Apr 17 20:13:58.853403 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.853379 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t958m\" (UniqueName: \"kubernetes.io/projected/2f1cf45b-3c65-4be4-98b0-56c73abdb5fa-kube-api-access-t958m\") pod \"2f1cf45b-3c65-4be4-98b0-56c73abdb5fa\" (UID: \"2f1cf45b-3c65-4be4-98b0-56c73abdb5fa\") " Apr 17 20:13:58.855412 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.855387 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1cf45b-3c65-4be4-98b0-56c73abdb5fa-kube-api-access-t958m" (OuterVolumeSpecName: "kube-api-access-t958m") pod "2f1cf45b-3c65-4be4-98b0-56c73abdb5fa" (UID: "2f1cf45b-3c65-4be4-98b0-56c73abdb5fa"). InnerVolumeSpecName "kube-api-access-t958m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:13:58.855991 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.855974 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-vn757" Apr 17 20:13:58.954138 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.954100 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmvtn\" (UniqueName: \"kubernetes.io/projected/beb9ac41-f38b-4a84-a442-db39de1c166c-kube-api-access-mmvtn\") pod \"beb9ac41-f38b-4a84-a442-db39de1c166c\" (UID: \"beb9ac41-f38b-4a84-a442-db39de1c166c\") " Apr 17 20:13:58.954338 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.954249 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t958m\" (UniqueName: \"kubernetes.io/projected/2f1cf45b-3c65-4be4-98b0-56c73abdb5fa-kube-api-access-t958m\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:13:58.956204 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:58.956176 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb9ac41-f38b-4a84-a442-db39de1c166c-kube-api-access-mmvtn" (OuterVolumeSpecName: "kube-api-access-mmvtn") pod "beb9ac41-f38b-4a84-a442-db39de1c166c" (UID: "beb9ac41-f38b-4a84-a442-db39de1c166c"). InnerVolumeSpecName "kube-api-access-mmvtn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:13:59.055588 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.055546 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mmvtn\" (UniqueName: \"kubernetes.io/projected/beb9ac41-f38b-4a84-a442-db39de1c166c-kube-api-access-mmvtn\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:13:59.328518 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.328484 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-szq6s"] Apr 17 20:13:59.328873 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.328850 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beb9ac41-f38b-4a84-a442-db39de1c166c" containerName="authorino" Apr 17 20:13:59.328873 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.328868 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb9ac41-f38b-4a84-a442-db39de1c166c" containerName="authorino" Apr 17 20:13:59.328873 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.328877 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f1cf45b-3c65-4be4-98b0-56c73abdb5fa" containerName="authorino" Apr 17 20:13:59.329058 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.328883 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1cf45b-3c65-4be4-98b0-56c73abdb5fa" containerName="authorino" Apr 17 20:13:59.329058 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.328940 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f1cf45b-3c65-4be4-98b0-56c73abdb5fa" containerName="authorino" Apr 17 20:13:59.329058 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.328947 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="beb9ac41-f38b-4a84-a442-db39de1c166c" containerName="authorino" Apr 17 20:13:59.332097 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.332077 2576 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" Apr 17 20:13:59.334391 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.334369 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-n25rc\"" Apr 17 20:13:59.347154 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.347131 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-szq6s"] Apr 17 20:13:59.358569 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.358543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfd85\" (UniqueName: \"kubernetes.io/projected/69a0956d-07a8-4de0-b717-a7d77e93632b-kube-api-access-sfd85\") pod \"maas-controller-6d4c8f55f9-szq6s\" (UID: \"69a0956d-07a8-4de0-b717-a7d77e93632b\") " pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" Apr 17 20:13:59.459341 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.459307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfd85\" (UniqueName: \"kubernetes.io/projected/69a0956d-07a8-4de0-b717-a7d77e93632b-kube-api-access-sfd85\") pod \"maas-controller-6d4c8f55f9-szq6s\" (UID: \"69a0956d-07a8-4de0-b717-a7d77e93632b\") " pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" Apr 17 20:13:59.469171 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.469133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfd85\" (UniqueName: \"kubernetes.io/projected/69a0956d-07a8-4de0-b717-a7d77e93632b-kube-api-access-sfd85\") pod \"maas-controller-6d4c8f55f9-szq6s\" (UID: \"69a0956d-07a8-4de0-b717-a7d77e93632b\") " pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" Apr 17 20:13:59.472602 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.472575 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7b9dcbdb6d-575lm"] Apr 17 
20:13:59.475950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.475932 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" Apr 17 20:13:59.484880 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.484822 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7b9dcbdb6d-575lm"] Apr 17 20:13:59.557769 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.557739 2576 generic.go:358] "Generic (PLEG): container finished" podID="2f1cf45b-3c65-4be4-98b0-56c73abdb5fa" containerID="5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce" exitCode=0 Apr 17 20:13:59.558177 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.557808 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-4vw2w" Apr 17 20:13:59.558177 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.557832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-4vw2w" event={"ID":"2f1cf45b-3c65-4be4-98b0-56c73abdb5fa","Type":"ContainerDied","Data":"5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce"} Apr 17 20:13:59.558177 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.557879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-4vw2w" event={"ID":"2f1cf45b-3c65-4be4-98b0-56c73abdb5fa","Type":"ContainerDied","Data":"b3d3627aa28b29bf140ac9dd4f9f4c09426e6bbd9a589b2d6a11e0cb2b1013bb"} Apr 17 20:13:59.558177 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.557895 2576 scope.go:117] "RemoveContainer" containerID="5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce" Apr 17 20:13:59.559111 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.559009 2576 generic.go:358] "Generic (PLEG): container finished" podID="beb9ac41-f38b-4a84-a442-db39de1c166c" 
containerID="796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c" exitCode=0 Apr 17 20:13:59.559111 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.559068 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-vn757" Apr 17 20:13:59.559111 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.559095 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-vn757" event={"ID":"beb9ac41-f38b-4a84-a442-db39de1c166c","Type":"ContainerDied","Data":"796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c"} Apr 17 20:13:59.559244 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.559129 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-vn757" event={"ID":"beb9ac41-f38b-4a84-a442-db39de1c166c","Type":"ContainerDied","Data":"298a560c0ab8da87a40728439fa6f7c70e5e12f7e9a61df04d8348e6f3f8f0da"} Apr 17 20:13:59.559826 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.559804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdmcm\" (UniqueName: \"kubernetes.io/projected/564b30fd-1328-4f28-9961-f0afe1beeaaf-kube-api-access-mdmcm\") pod \"maas-controller-7b9dcbdb6d-575lm\" (UID: \"564b30fd-1328-4f28-9961-f0afe1beeaaf\") " pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" Apr 17 20:13:59.566145 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.566127 2576 scope.go:117] "RemoveContainer" containerID="5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce" Apr 17 20:13:59.566411 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:13:59.566387 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce\": container with ID starting with 5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce 
not found: ID does not exist" containerID="5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce" Apr 17 20:13:59.566506 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.566415 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce"} err="failed to get container status \"5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce\": rpc error: code = NotFound desc = could not find container \"5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce\": container with ID starting with 5cc3e1842599d86fb760607c687e4f7e3bc76ef022f1acd4fd599994a5cee9ce not found: ID does not exist" Apr 17 20:13:59.566506 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.566431 2576 scope.go:117] "RemoveContainer" containerID="796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c" Apr 17 20:13:59.573738 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.573714 2576 scope.go:117] "RemoveContainer" containerID="796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c" Apr 17 20:13:59.574010 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:13:59.573988 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c\": container with ID starting with 796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c not found: ID does not exist" containerID="796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c" Apr 17 20:13:59.574066 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.574016 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c"} err="failed to get container status \"796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c\": rpc error: code = NotFound desc = could 
not find container \"796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c\": container with ID starting with 796e7edecbe488bf4592ac8fc8958205d7509f3e62706df9f0b0548d4f55787c not found: ID does not exist" Apr 17 20:13:59.581216 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.581166 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-vn757"] Apr 17 20:13:59.586023 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.585953 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-szq6s"] Apr 17 20:13:59.586267 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.586252 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" Apr 17 20:13:59.587281 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.587265 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-vn757"] Apr 17 20:13:59.605163 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.605119 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4vw2w"] Apr 17 20:13:59.611248 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.611221 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7f8b87ff8b-882w7"] Apr 17 20:13:59.615712 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.615691 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4vw2w"] Apr 17 20:13:59.615805 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.615798 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7f8b87ff8b-882w7" Apr 17 20:13:59.620618 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.620597 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f8b87ff8b-882w7"] Apr 17 20:13:59.640956 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.640906 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="248db835-de5c-4fd4-8afd-9a61c9abcaea" path="/var/lib/kubelet/pods/248db835-de5c-4fd4-8afd-9a61c9abcaea/volumes" Apr 17 20:13:59.641286 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.641265 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1cf45b-3c65-4be4-98b0-56c73abdb5fa" path="/var/lib/kubelet/pods/2f1cf45b-3c65-4be4-98b0-56c73abdb5fa/volumes" Apr 17 20:13:59.642649 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.641731 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beb9ac41-f38b-4a84-a442-db39de1c166c" path="/var/lib/kubelet/pods/beb9ac41-f38b-4a84-a442-db39de1c166c/volumes" Apr 17 20:13:59.660492 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.660463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdmcm\" (UniqueName: \"kubernetes.io/projected/564b30fd-1328-4f28-9961-f0afe1beeaaf-kube-api-access-mdmcm\") pod \"maas-controller-7b9dcbdb6d-575lm\" (UID: \"564b30fd-1328-4f28-9961-f0afe1beeaaf\") " pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" Apr 17 20:13:59.660630 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.660580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7wnj\" (UniqueName: \"kubernetes.io/projected/f38a8d02-26c2-49bd-b474-3fd36ec1e756-kube-api-access-f7wnj\") pod \"maas-controller-7f8b87ff8b-882w7\" (UID: \"f38a8d02-26c2-49bd-b474-3fd36ec1e756\") " pod="opendatahub/maas-controller-7f8b87ff8b-882w7" Apr 17 20:13:59.671280 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:13:59.671231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdmcm\" (UniqueName: \"kubernetes.io/projected/564b30fd-1328-4f28-9961-f0afe1beeaaf-kube-api-access-mdmcm\") pod \"maas-controller-7b9dcbdb6d-575lm\" (UID: \"564b30fd-1328-4f28-9961-f0afe1beeaaf\") " pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" Apr 17 20:13:59.714476 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.714446 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-szq6s"] Apr 17 20:13:59.717235 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:13:59.717206 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a0956d_07a8_4de0_b717_a7d77e93632b.slice/crio-5d346e9f98f179130f86aa266f9c148d4c1dc5fd846af2d3228a1e55f5c1505a WatchSource:0}: Error finding container 5d346e9f98f179130f86aa266f9c148d4c1dc5fd846af2d3228a1e55f5c1505a: Status 404 returned error can't find the container with id 5d346e9f98f179130f86aa266f9c148d4c1dc5fd846af2d3228a1e55f5c1505a Apr 17 20:13:59.761154 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.761117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7wnj\" (UniqueName: \"kubernetes.io/projected/f38a8d02-26c2-49bd-b474-3fd36ec1e756-kube-api-access-f7wnj\") pod \"maas-controller-7f8b87ff8b-882w7\" (UID: \"f38a8d02-26c2-49bd-b474-3fd36ec1e756\") " pod="opendatahub/maas-controller-7f8b87ff8b-882w7" Apr 17 20:13:59.768800 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.768774 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7wnj\" (UniqueName: \"kubernetes.io/projected/f38a8d02-26c2-49bd-b474-3fd36ec1e756-kube-api-access-f7wnj\") pod \"maas-controller-7f8b87ff8b-882w7\" (UID: \"f38a8d02-26c2-49bd-b474-3fd36ec1e756\") " pod="opendatahub/maas-controller-7f8b87ff8b-882w7" Apr 17 
20:13:59.786205 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.786176 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" Apr 17 20:13:59.903341 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.903313 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7b9dcbdb6d-575lm"] Apr 17 20:13:59.905194 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:13:59.905165 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod564b30fd_1328_4f28_9961_f0afe1beeaaf.slice/crio-3086c3742c0330ce5860e345dc955c55542fec5691321ac914daab2f1e1fab64 WatchSource:0}: Error finding container 3086c3742c0330ce5860e345dc955c55542fec5691321ac914daab2f1e1fab64: Status 404 returned error can't find the container with id 3086c3742c0330ce5860e345dc955c55542fec5691321ac914daab2f1e1fab64 Apr 17 20:13:59.927993 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:13:59.927967 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7f8b87ff8b-882w7" Apr 17 20:14:00.045136 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:00.045110 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f8b87ff8b-882w7"] Apr 17 20:14:00.047513 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:14:00.047485 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf38a8d02_26c2_49bd_b474_3fd36ec1e756.slice/crio-948199cdb3c0bd788396eef51a4b1496292e601f2463fd2cabda9b626316c067 WatchSource:0}: Error finding container 948199cdb3c0bd788396eef51a4b1496292e601f2463fd2cabda9b626316c067: Status 404 returned error can't find the container with id 948199cdb3c0bd788396eef51a4b1496292e601f2463fd2cabda9b626316c067 Apr 17 20:14:00.581480 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:00.581444 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f8b87ff8b-882w7" event={"ID":"f38a8d02-26c2-49bd-b474-3fd36ec1e756","Type":"ContainerStarted","Data":"948199cdb3c0bd788396eef51a4b1496292e601f2463fd2cabda9b626316c067"} Apr 17 20:14:00.586300 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:00.586264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" event={"ID":"564b30fd-1328-4f28-9961-f0afe1beeaaf","Type":"ContainerStarted","Data":"3086c3742c0330ce5860e345dc955c55542fec5691321ac914daab2f1e1fab64"} Apr 17 20:14:00.590335 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:00.590304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" event={"ID":"69a0956d-07a8-4de0-b717-a7d77e93632b","Type":"ContainerStarted","Data":"5d346e9f98f179130f86aa266f9c148d4c1dc5fd846af2d3228a1e55f5c1505a"} Apr 17 20:14:03.608282 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.608238 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" event={"ID":"69a0956d-07a8-4de0-b717-a7d77e93632b","Type":"ContainerStarted","Data":"f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df"} Apr 17 20:14:03.608754 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.608279 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" podUID="69a0956d-07a8-4de0-b717-a7d77e93632b" containerName="manager" containerID="cri-o://f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df" gracePeriod=10 Apr 17 20:14:03.608754 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.608292 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" Apr 17 20:14:03.609684 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.609662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f8b87ff8b-882w7" event={"ID":"f38a8d02-26c2-49bd-b474-3fd36ec1e756","Type":"ContainerStarted","Data":"4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56"} Apr 17 20:14:03.609819 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.609759 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7f8b87ff8b-882w7" Apr 17 20:14:03.610978 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.610952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" event={"ID":"564b30fd-1328-4f28-9961-f0afe1beeaaf","Type":"ContainerStarted","Data":"48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967"} Apr 17 20:14:03.611178 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.611157 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" Apr 17 20:14:03.625200 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.625159 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" podStartSLOduration=1.503771225 podStartE2EDuration="4.62514707s" podCreationTimestamp="2026-04-17 20:13:59 +0000 UTC" firstStartedPulling="2026-04-17 20:13:59.718545485 +0000 UTC m=+578.586365224" lastFinishedPulling="2026-04-17 20:14:02.839921328 +0000 UTC m=+581.707741069" observedRunningTime="2026-04-17 20:14:03.623643727 +0000 UTC m=+582.491463488" watchObservedRunningTime="2026-04-17 20:14:03.62514707 +0000 UTC m=+582.492966831" Apr 17 20:14:03.638752 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.638693 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7f8b87ff8b-882w7" podStartSLOduration=1.837521011 podStartE2EDuration="4.638675746s" podCreationTimestamp="2026-04-17 20:13:59 +0000 UTC" firstStartedPulling="2026-04-17 20:14:00.048892571 +0000 UTC m=+578.916712310" lastFinishedPulling="2026-04-17 20:14:02.850047306 +0000 UTC m=+581.717867045" observedRunningTime="2026-04-17 20:14:03.637276065 +0000 UTC m=+582.505095828" watchObservedRunningTime="2026-04-17 20:14:03.638675746 +0000 UTC m=+582.506495509" Apr 17 20:14:03.653880 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.653805 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" podStartSLOduration=1.720441858 podStartE2EDuration="4.653791785s" podCreationTimestamp="2026-04-17 20:13:59 +0000 UTC" firstStartedPulling="2026-04-17 20:13:59.906555407 +0000 UTC m=+578.774375146" lastFinishedPulling="2026-04-17 20:14:02.839905334 +0000 UTC m=+581.707725073" observedRunningTime="2026-04-17 20:14:03.653225437 +0000 UTC m=+582.521045199" watchObservedRunningTime="2026-04-17 20:14:03.653791785 +0000 UTC m=+582.521611545" Apr 17 20:14:03.854194 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.854168 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" Apr 17 20:14:03.899188 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.899104 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfd85\" (UniqueName: \"kubernetes.io/projected/69a0956d-07a8-4de0-b717-a7d77e93632b-kube-api-access-sfd85\") pod \"69a0956d-07a8-4de0-b717-a7d77e93632b\" (UID: \"69a0956d-07a8-4de0-b717-a7d77e93632b\") " Apr 17 20:14:03.901359 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:03.901333 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a0956d-07a8-4de0-b717-a7d77e93632b-kube-api-access-sfd85" (OuterVolumeSpecName: "kube-api-access-sfd85") pod "69a0956d-07a8-4de0-b717-a7d77e93632b" (UID: "69a0956d-07a8-4de0-b717-a7d77e93632b"). InnerVolumeSpecName "kube-api-access-sfd85". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:14:04.000616 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:04.000575 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sfd85\" (UniqueName: \"kubernetes.io/projected/69a0956d-07a8-4de0-b717-a7d77e93632b-kube-api-access-sfd85\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:14:04.615226 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:04.615189 2576 generic.go:358] "Generic (PLEG): container finished" podID="69a0956d-07a8-4de0-b717-a7d77e93632b" containerID="f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df" exitCode=0 Apr 17 20:14:04.615711 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:04.615258 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" Apr 17 20:14:04.615711 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:04.615267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" event={"ID":"69a0956d-07a8-4de0-b717-a7d77e93632b","Type":"ContainerDied","Data":"f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df"} Apr 17 20:14:04.615711 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:04.615310 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-szq6s" event={"ID":"69a0956d-07a8-4de0-b717-a7d77e93632b","Type":"ContainerDied","Data":"5d346e9f98f179130f86aa266f9c148d4c1dc5fd846af2d3228a1e55f5c1505a"} Apr 17 20:14:04.615711 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:04.615335 2576 scope.go:117] "RemoveContainer" containerID="f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df" Apr 17 20:14:04.623232 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:04.623216 2576 scope.go:117] "RemoveContainer" containerID="f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df" Apr 17 20:14:04.623490 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:14:04.623469 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df\": container with ID starting with f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df not found: ID does not exist" containerID="f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df" Apr 17 20:14:04.623536 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:04.623501 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df"} err="failed to get container status \"f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df\": rpc error: 
code = NotFound desc = could not find container \"f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df\": container with ID starting with f11bca9d164451c4afe59c3e21b795d6864c28d1a92ecfed48763a3855a3e8df not found: ID does not exist" Apr 17 20:14:04.635146 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:04.635123 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-szq6s"] Apr 17 20:14:04.638894 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:04.638829 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-szq6s"] Apr 17 20:14:05.174050 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.174020 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-79456bcd74-7rzn5"] Apr 17 20:14:05.174343 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.174331 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69a0956d-07a8-4de0-b717-a7d77e93632b" containerName="manager" Apr 17 20:14:05.174388 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.174345 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a0956d-07a8-4de0-b717-a7d77e93632b" containerName="manager" Apr 17 20:14:05.174421 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.174415 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="69a0956d-07a8-4de0-b717-a7d77e93632b" containerName="manager" Apr 17 20:14:05.176821 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.176804 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-79456bcd74-7rzn5" Apr 17 20:14:05.179185 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.179164 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 20:14:05.179349 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.179170 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 20:14:05.179349 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.179171 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-qrvkh\"" Apr 17 20:14:05.188709 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.188675 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-79456bcd74-7rzn5"] Apr 17 20:14:05.211355 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.211318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdf5r\" (UniqueName: \"kubernetes.io/projected/b640df52-24ae-430c-92c3-9e93a5e7baaf-kube-api-access-kdf5r\") pod \"maas-api-79456bcd74-7rzn5\" (UID: \"b640df52-24ae-430c-92c3-9e93a5e7baaf\") " pod="opendatahub/maas-api-79456bcd74-7rzn5" Apr 17 20:14:05.211494 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.211381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b640df52-24ae-430c-92c3-9e93a5e7baaf-maas-api-tls\") pod \"maas-api-79456bcd74-7rzn5\" (UID: \"b640df52-24ae-430c-92c3-9e93a5e7baaf\") " pod="opendatahub/maas-api-79456bcd74-7rzn5" Apr 17 20:14:05.312810 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.312772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdf5r\" (UniqueName: 
\"kubernetes.io/projected/b640df52-24ae-430c-92c3-9e93a5e7baaf-kube-api-access-kdf5r\") pod \"maas-api-79456bcd74-7rzn5\" (UID: \"b640df52-24ae-430c-92c3-9e93a5e7baaf\") " pod="opendatahub/maas-api-79456bcd74-7rzn5" Apr 17 20:14:05.312967 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.312827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b640df52-24ae-430c-92c3-9e93a5e7baaf-maas-api-tls\") pod \"maas-api-79456bcd74-7rzn5\" (UID: \"b640df52-24ae-430c-92c3-9e93a5e7baaf\") " pod="opendatahub/maas-api-79456bcd74-7rzn5" Apr 17 20:14:05.315237 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.315206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b640df52-24ae-430c-92c3-9e93a5e7baaf-maas-api-tls\") pod \"maas-api-79456bcd74-7rzn5\" (UID: \"b640df52-24ae-430c-92c3-9e93a5e7baaf\") " pod="opendatahub/maas-api-79456bcd74-7rzn5" Apr 17 20:14:05.322110 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.322083 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdf5r\" (UniqueName: \"kubernetes.io/projected/b640df52-24ae-430c-92c3-9e93a5e7baaf-kube-api-access-kdf5r\") pod \"maas-api-79456bcd74-7rzn5\" (UID: \"b640df52-24ae-430c-92c3-9e93a5e7baaf\") " pod="opendatahub/maas-api-79456bcd74-7rzn5" Apr 17 20:14:05.487276 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.487177 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-79456bcd74-7rzn5" Apr 17 20:14:05.611500 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.611476 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-79456bcd74-7rzn5"] Apr 17 20:14:05.614022 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:14:05.613992 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb640df52_24ae_430c_92c3_9e93a5e7baaf.slice/crio-f88abe05ff7c231a6e69ca1538796f49e4a09d96f114760e8456e940a764a053 WatchSource:0}: Error finding container f88abe05ff7c231a6e69ca1538796f49e4a09d96f114760e8456e940a764a053: Status 404 returned error can't find the container with id f88abe05ff7c231a6e69ca1538796f49e4a09d96f114760e8456e940a764a053 Apr 17 20:14:05.626930 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.626902 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-79456bcd74-7rzn5" event={"ID":"b640df52-24ae-430c-92c3-9e93a5e7baaf","Type":"ContainerStarted","Data":"f88abe05ff7c231a6e69ca1538796f49e4a09d96f114760e8456e940a764a053"} Apr 17 20:14:05.639967 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:05.639936 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a0956d-07a8-4de0-b717-a7d77e93632b" path="/var/lib/kubelet/pods/69a0956d-07a8-4de0-b717-a7d77e93632b/volumes" Apr 17 20:14:07.640475 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:07.640396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-79456bcd74-7rzn5" event={"ID":"b640df52-24ae-430c-92c3-9e93a5e7baaf","Type":"ContainerStarted","Data":"847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67"} Apr 17 20:14:07.640475 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:07.640454 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-79456bcd74-7rzn5" Apr 17 20:14:07.657591 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:14:07.657541 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-79456bcd74-7rzn5" podStartSLOduration=0.967864006 podStartE2EDuration="2.657525379s" podCreationTimestamp="2026-04-17 20:14:05 +0000 UTC" firstStartedPulling="2026-04-17 20:14:05.615299975 +0000 UTC m=+584.483119714" lastFinishedPulling="2026-04-17 20:14:07.304961333 +0000 UTC m=+586.172781087" observedRunningTime="2026-04-17 20:14:07.655588881 +0000 UTC m=+586.523408643" watchObservedRunningTime="2026-04-17 20:14:07.657525379 +0000 UTC m=+586.525345181" Apr 17 20:14:13.648068 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:13.648042 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-79456bcd74-7rzn5" Apr 17 20:14:14.621403 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.621367 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7f8b87ff8b-882w7" Apr 17 20:14:14.621775 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.621757 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" Apr 17 20:14:14.674349 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.674316 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7b9dcbdb6d-575lm"] Apr 17 20:14:14.674743 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.674509 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" podUID="564b30fd-1328-4f28-9961-f0afe1beeaaf" containerName="manager" containerID="cri-o://48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967" gracePeriod=10 Apr 17 20:14:14.912766 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.912743 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" Apr 17 20:14:14.958671 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.958638 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-54d89dbfcc-xwpgt"] Apr 17 20:14:14.959006 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.958992 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="564b30fd-1328-4f28-9961-f0afe1beeaaf" containerName="manager" Apr 17 20:14:14.959006 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.959008 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="564b30fd-1328-4f28-9961-f0afe1beeaaf" containerName="manager" Apr 17 20:14:14.959098 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.959068 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="564b30fd-1328-4f28-9961-f0afe1beeaaf" containerName="manager" Apr 17 20:14:14.961022 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.961006 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-54d89dbfcc-xwpgt" Apr 17 20:14:14.968557 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.968506 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-54d89dbfcc-xwpgt"] Apr 17 20:14:14.994014 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.993985 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdmcm\" (UniqueName: \"kubernetes.io/projected/564b30fd-1328-4f28-9961-f0afe1beeaaf-kube-api-access-mdmcm\") pod \"564b30fd-1328-4f28-9961-f0afe1beeaaf\" (UID: \"564b30fd-1328-4f28-9961-f0afe1beeaaf\") " Apr 17 20:14:14.996088 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:14.996062 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564b30fd-1328-4f28-9961-f0afe1beeaaf-kube-api-access-mdmcm" (OuterVolumeSpecName: "kube-api-access-mdmcm") pod "564b30fd-1328-4f28-9961-f0afe1beeaaf" (UID: "564b30fd-1328-4f28-9961-f0afe1beeaaf"). InnerVolumeSpecName "kube-api-access-mdmcm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:14:15.095594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.095550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxhng\" (UniqueName: \"kubernetes.io/projected/8d6c6dde-851f-4c11-84a9-81fc00ad270c-kube-api-access-cxhng\") pod \"maas-controller-54d89dbfcc-xwpgt\" (UID: \"8d6c6dde-851f-4c11-84a9-81fc00ad270c\") " pod="opendatahub/maas-controller-54d89dbfcc-xwpgt" Apr 17 20:14:15.095813 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.095607 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mdmcm\" (UniqueName: \"kubernetes.io/projected/564b30fd-1328-4f28-9961-f0afe1beeaaf-kube-api-access-mdmcm\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:14:15.196689 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.196598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxhng\" (UniqueName: \"kubernetes.io/projected/8d6c6dde-851f-4c11-84a9-81fc00ad270c-kube-api-access-cxhng\") pod \"maas-controller-54d89dbfcc-xwpgt\" (UID: \"8d6c6dde-851f-4c11-84a9-81fc00ad270c\") " pod="opendatahub/maas-controller-54d89dbfcc-xwpgt" Apr 17 20:14:15.205008 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.204985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxhng\" (UniqueName: \"kubernetes.io/projected/8d6c6dde-851f-4c11-84a9-81fc00ad270c-kube-api-access-cxhng\") pod \"maas-controller-54d89dbfcc-xwpgt\" (UID: \"8d6c6dde-851f-4c11-84a9-81fc00ad270c\") " pod="opendatahub/maas-controller-54d89dbfcc-xwpgt" Apr 17 20:14:15.272001 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.271965 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-54d89dbfcc-xwpgt" Apr 17 20:14:15.388668 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.388641 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-54d89dbfcc-xwpgt"] Apr 17 20:14:15.391104 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:14:15.391075 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d6c6dde_851f_4c11_84a9_81fc00ad270c.slice/crio-5a88f5004da336eadc8ac67b3e028d682df87d2928a22cb7ee0c88a861d2e969 WatchSource:0}: Error finding container 5a88f5004da336eadc8ac67b3e028d682df87d2928a22cb7ee0c88a861d2e969: Status 404 returned error can't find the container with id 5a88f5004da336eadc8ac67b3e028d682df87d2928a22cb7ee0c88a861d2e969 Apr 17 20:14:15.665552 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.665515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54d89dbfcc-xwpgt" event={"ID":"8d6c6dde-851f-4c11-84a9-81fc00ad270c","Type":"ContainerStarted","Data":"5a88f5004da336eadc8ac67b3e028d682df87d2928a22cb7ee0c88a861d2e969"} Apr 17 20:14:15.666675 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.666645 2576 generic.go:358] "Generic (PLEG): container finished" podID="564b30fd-1328-4f28-9961-f0afe1beeaaf" containerID="48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967" exitCode=0 Apr 17 20:14:15.666813 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.666690 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" event={"ID":"564b30fd-1328-4f28-9961-f0afe1beeaaf","Type":"ContainerDied","Data":"48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967"} Apr 17 20:14:15.666813 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.666719 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" Apr 17 20:14:15.666813 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.666734 2576 scope.go:117] "RemoveContainer" containerID="48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967" Apr 17 20:14:15.666999 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.666720 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b9dcbdb6d-575lm" event={"ID":"564b30fd-1328-4f28-9961-f0afe1beeaaf","Type":"ContainerDied","Data":"3086c3742c0330ce5860e345dc955c55542fec5691321ac914daab2f1e1fab64"} Apr 17 20:14:15.674814 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.674779 2576 scope.go:117] "RemoveContainer" containerID="48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967" Apr 17 20:14:15.675168 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:14:15.675147 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967\": container with ID starting with 48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967 not found: ID does not exist" containerID="48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967" Apr 17 20:14:15.675211 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.675178 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967"} err="failed to get container status \"48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967\": rpc error: code = NotFound desc = could not find container \"48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967\": container with ID starting with 48980931cf05fc6df28c7ae9435286c503eee7e5ac763870383c2aa322d7b967 not found: ID does not exist" Apr 17 20:14:15.682430 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.682409 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7b9dcbdb6d-575lm"] Apr 17 20:14:15.685471 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:15.685445 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7b9dcbdb6d-575lm"] Apr 17 20:14:16.672723 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:16.672691 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54d89dbfcc-xwpgt" event={"ID":"8d6c6dde-851f-4c11-84a9-81fc00ad270c","Type":"ContainerStarted","Data":"7e224e7495b1090c5b4c82b0e4821df8b953caa1ecc7261ef57da0113ad675d2"} Apr 17 20:14:16.672951 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:16.672797 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-54d89dbfcc-xwpgt" Apr 17 20:14:16.686657 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:16.686607 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-54d89dbfcc-xwpgt" podStartSLOduration=2.393604707 podStartE2EDuration="2.686591264s" podCreationTimestamp="2026-04-17 20:14:14 +0000 UTC" firstStartedPulling="2026-04-17 20:14:15.392341437 +0000 UTC m=+594.260161177" lastFinishedPulling="2026-04-17 20:14:15.685327981 +0000 UTC m=+594.553147734" observedRunningTime="2026-04-17 20:14:16.686425378 +0000 UTC m=+595.554245140" watchObservedRunningTime="2026-04-17 20:14:16.686591264 +0000 UTC m=+595.554411024" Apr 17 20:14:17.638604 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:17.638571 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564b30fd-1328-4f28-9961-f0afe1beeaaf" path="/var/lib/kubelet/pods/564b30fd-1328-4f28-9961-f0afe1beeaaf/volumes" Apr 17 20:14:21.552385 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:21.552357 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/0.log" Apr 17 20:14:21.552826 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:21.552724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/0.log" Apr 17 20:14:27.680430 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:27.680397 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-54d89dbfcc-xwpgt" Apr 17 20:14:27.719792 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:27.719764 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7f8b87ff8b-882w7"] Apr 17 20:14:27.720045 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:27.720018 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7f8b87ff8b-882w7" podUID="f38a8d02-26c2-49bd-b474-3fd36ec1e756" containerName="manager" containerID="cri-o://4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56" gracePeriod=10 Apr 17 20:14:27.963550 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:27.963525 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7f8b87ff8b-882w7" Apr 17 20:14:28.109249 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:28.109201 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7wnj\" (UniqueName: \"kubernetes.io/projected/f38a8d02-26c2-49bd-b474-3fd36ec1e756-kube-api-access-f7wnj\") pod \"f38a8d02-26c2-49bd-b474-3fd36ec1e756\" (UID: \"f38a8d02-26c2-49bd-b474-3fd36ec1e756\") " Apr 17 20:14:28.111238 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:28.111207 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f38a8d02-26c2-49bd-b474-3fd36ec1e756-kube-api-access-f7wnj" (OuterVolumeSpecName: "kube-api-access-f7wnj") pod "f38a8d02-26c2-49bd-b474-3fd36ec1e756" (UID: "f38a8d02-26c2-49bd-b474-3fd36ec1e756"). InnerVolumeSpecName "kube-api-access-f7wnj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:14:28.210157 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:28.210080 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f7wnj\" (UniqueName: \"kubernetes.io/projected/f38a8d02-26c2-49bd-b474-3fd36ec1e756-kube-api-access-f7wnj\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:14:28.710559 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:28.710518 2576 generic.go:358] "Generic (PLEG): container finished" podID="f38a8d02-26c2-49bd-b474-3fd36ec1e756" containerID="4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56" exitCode=0 Apr 17 20:14:28.711014 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:28.710584 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7f8b87ff8b-882w7" Apr 17 20:14:28.711014 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:28.710584 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f8b87ff8b-882w7" event={"ID":"f38a8d02-26c2-49bd-b474-3fd36ec1e756","Type":"ContainerDied","Data":"4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56"} Apr 17 20:14:28.711014 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:28.710630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f8b87ff8b-882w7" event={"ID":"f38a8d02-26c2-49bd-b474-3fd36ec1e756","Type":"ContainerDied","Data":"948199cdb3c0bd788396eef51a4b1496292e601f2463fd2cabda9b626316c067"} Apr 17 20:14:28.711014 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:28.710652 2576 scope.go:117] "RemoveContainer" containerID="4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56" Apr 17 20:14:28.720105 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:28.720060 2576 scope.go:117] "RemoveContainer" containerID="4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56" Apr 17 20:14:28.720386 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:14:28.720366 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56\": container with ID starting with 4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56 not found: ID does not exist" containerID="4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56" Apr 17 20:14:28.720453 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:28.720399 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56"} err="failed to get container status \"4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56\": rpc error: 
code = NotFound desc = could not find container \"4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56\": container with ID starting with 4c6a345d110173db648609a75dc09fea3c98c20646038d82b6161c54ceb37a56 not found: ID does not exist" Apr 17 20:14:28.733412 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:28.733162 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7f8b87ff8b-882w7"] Apr 17 20:14:28.734965 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:28.734940 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7f8b87ff8b-882w7"] Apr 17 20:14:29.638929 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:29.638896 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f38a8d02-26c2-49bd-b474-3fd36ec1e756" path="/var/lib/kubelet/pods/f38a8d02-26c2-49bd-b474-3fd36ec1e756/volumes" Apr 17 20:14:53.851914 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:53.851880 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-79456bcd74-7rzn5"] Apr 17 20:14:53.854647 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:53.852177 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-79456bcd74-7rzn5" podUID="b640df52-24ae-430c-92c3-9e93a5e7baaf" containerName="maas-api" containerID="cri-o://847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67" gracePeriod=30 Apr 17 20:14:54.087914 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.087888 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-79456bcd74-7rzn5" Apr 17 20:14:54.131042 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.130968 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdf5r\" (UniqueName: \"kubernetes.io/projected/b640df52-24ae-430c-92c3-9e93a5e7baaf-kube-api-access-kdf5r\") pod \"b640df52-24ae-430c-92c3-9e93a5e7baaf\" (UID: \"b640df52-24ae-430c-92c3-9e93a5e7baaf\") " Apr 17 20:14:54.131042 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.131018 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b640df52-24ae-430c-92c3-9e93a5e7baaf-maas-api-tls\") pod \"b640df52-24ae-430c-92c3-9e93a5e7baaf\" (UID: \"b640df52-24ae-430c-92c3-9e93a5e7baaf\") " Apr 17 20:14:54.133090 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.133062 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b640df52-24ae-430c-92c3-9e93a5e7baaf-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "b640df52-24ae-430c-92c3-9e93a5e7baaf" (UID: "b640df52-24ae-430c-92c3-9e93a5e7baaf"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:14:54.133177 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.133147 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b640df52-24ae-430c-92c3-9e93a5e7baaf-kube-api-access-kdf5r" (OuterVolumeSpecName: "kube-api-access-kdf5r") pod "b640df52-24ae-430c-92c3-9e93a5e7baaf" (UID: "b640df52-24ae-430c-92c3-9e93a5e7baaf"). InnerVolumeSpecName "kube-api-access-kdf5r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:14:54.231623 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.231586 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kdf5r\" (UniqueName: \"kubernetes.io/projected/b640df52-24ae-430c-92c3-9e93a5e7baaf-kube-api-access-kdf5r\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:14:54.231623 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.231621 2576 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b640df52-24ae-430c-92c3-9e93a5e7baaf-maas-api-tls\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\"" Apr 17 20:14:54.799989 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.799955 2576 generic.go:358] "Generic (PLEG): container finished" podID="b640df52-24ae-430c-92c3-9e93a5e7baaf" containerID="847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67" exitCode=0 Apr 17 20:14:54.800213 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.800026 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-79456bcd74-7rzn5"
Apr 17 20:14:54.800213 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.800046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-79456bcd74-7rzn5" event={"ID":"b640df52-24ae-430c-92c3-9e93a5e7baaf","Type":"ContainerDied","Data":"847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67"}
Apr 17 20:14:54.800213 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.800085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-79456bcd74-7rzn5" event={"ID":"b640df52-24ae-430c-92c3-9e93a5e7baaf","Type":"ContainerDied","Data":"f88abe05ff7c231a6e69ca1538796f49e4a09d96f114760e8456e940a764a053"}
Apr 17 20:14:54.800213 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.800104 2576 scope.go:117] "RemoveContainer" containerID="847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67"
Apr 17 20:14:54.808097 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.808074 2576 scope.go:117] "RemoveContainer" containerID="847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67"
Apr 17 20:14:54.808374 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:14:54.808352 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67\": container with ID starting with 847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67 not found: ID does not exist" containerID="847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67"
Apr 17 20:14:54.808427 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.808384 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67"} err="failed to get container status \"847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67\": rpc error: code = NotFound desc = could not find container \"847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67\": container with ID starting with 847378f4b5ab260739a4df61e9b8e34903cce496616ba5068a96d1d2867ccd67 not found: ID does not exist"
Apr 17 20:14:54.820513 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.820484 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-79456bcd74-7rzn5"]
Apr 17 20:14:54.822088 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:54.822065 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-79456bcd74-7rzn5"]
Apr 17 20:14:55.638127 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:55.638095 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b640df52-24ae-430c-92c3-9e93a5e7baaf" path="/var/lib/kubelet/pods/b640df52-24ae-430c-92c3-9e93a5e7baaf/volumes"
Apr 17 20:14:59.764918 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.764879 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"]
Apr 17 20:14:59.765314 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.765202 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b640df52-24ae-430c-92c3-9e93a5e7baaf" containerName="maas-api"
Apr 17 20:14:59.765314 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.765213 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b640df52-24ae-430c-92c3-9e93a5e7baaf" containerName="maas-api"
Apr 17 20:14:59.765314 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.765231 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f38a8d02-26c2-49bd-b474-3fd36ec1e756" containerName="manager"
Apr 17 20:14:59.765314 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.765237 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f38a8d02-26c2-49bd-b474-3fd36ec1e756" containerName="manager"
Apr 17 20:14:59.765314 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.765292 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b640df52-24ae-430c-92c3-9e93a5e7baaf" containerName="maas-api"
Apr 17 20:14:59.765314 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.765300 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f38a8d02-26c2-49bd-b474-3fd36ec1e756" containerName="manager"
Apr 17 20:14:59.769924 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.769905 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.773180 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.773151 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-8zk9g\""
Apr 17 20:14:59.773308 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.773184 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 17 20:14:59.773308 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.773185 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 17 20:14:59.773308 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.773189 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 17 20:14:59.778450 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.778427 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"]
Apr 17 20:14:59.879607 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.879571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0288f2a6-594b-4a71-bf11-4312437fc0b7-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.879760 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.879623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0288f2a6-594b-4a71-bf11-4312437fc0b7-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.879760 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.879674 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0288f2a6-594b-4a71-bf11-4312437fc0b7-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.879760 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.879694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjwv8\" (UniqueName: \"kubernetes.io/projected/0288f2a6-594b-4a71-bf11-4312437fc0b7-kube-api-access-gjwv8\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.879760 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.879747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0288f2a6-594b-4a71-bf11-4312437fc0b7-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.879911 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.879795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0288f2a6-594b-4a71-bf11-4312437fc0b7-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.980432 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.980395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0288f2a6-594b-4a71-bf11-4312437fc0b7-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.980602 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.980439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0288f2a6-594b-4a71-bf11-4312437fc0b7-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.980602 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.980462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjwv8\" (UniqueName: \"kubernetes.io/projected/0288f2a6-594b-4a71-bf11-4312437fc0b7-kube-api-access-gjwv8\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.980602 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.980513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0288f2a6-594b-4a71-bf11-4312437fc0b7-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.980602 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.980562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0288f2a6-594b-4a71-bf11-4312437fc0b7-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.980602 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.980593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0288f2a6-594b-4a71-bf11-4312437fc0b7-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.980899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.980865 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0288f2a6-594b-4a71-bf11-4312437fc0b7-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.980986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.980961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0288f2a6-594b-4a71-bf11-4312437fc0b7-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.980986 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.980981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0288f2a6-594b-4a71-bf11-4312437fc0b7-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.982728 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.982701 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0288f2a6-594b-4a71-bf11-4312437fc0b7-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.983004 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.982985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0288f2a6-594b-4a71-bf11-4312437fc0b7-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:14:59.987767 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:14:59.987741 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjwv8\" (UniqueName: \"kubernetes.io/projected/0288f2a6-594b-4a71-bf11-4312437fc0b7-kube-api-access-gjwv8\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm\" (UID: \"0288f2a6-594b-4a71-bf11-4312437fc0b7\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:15:00.081493 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.081458 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:15:00.133779 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.133747 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607615-hxsgg"]
Apr 17 20:15:00.140865 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.140654 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg"
Apr 17 20:15:00.141444 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.141419 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607615-hxsgg"]
Apr 17 20:15:00.145338 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.144526 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-qrvkh\""
Apr 17 20:15:00.182234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.182204 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj9pv\" (UniqueName: \"kubernetes.io/projected/e59351fd-81e5-415c-92ac-94da8b9a7b76-kube-api-access-fj9pv\") pod \"maas-api-key-cleanup-29607615-hxsgg\" (UID: \"e59351fd-81e5-415c-92ac-94da8b9a7b76\") " pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg"
Apr 17 20:15:00.208493 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.208468 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"]
Apr 17 20:15:00.210924 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:15:00.210894 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0288f2a6_594b_4a71_bf11_4312437fc0b7.slice/crio-4e499ae7cc50785b5b5087a1d1e00f7cee03b0e445a11ff487998bcc32e3653c WatchSource:0}: Error finding container 4e499ae7cc50785b5b5087a1d1e00f7cee03b0e445a11ff487998bcc32e3653c: Status 404 returned error can't find the container with id 4e499ae7cc50785b5b5087a1d1e00f7cee03b0e445a11ff487998bcc32e3653c
Apr 17 20:15:00.212639 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.212624 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 20:15:00.283437 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.283402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fj9pv\" (UniqueName: \"kubernetes.io/projected/e59351fd-81e5-415c-92ac-94da8b9a7b76-kube-api-access-fj9pv\") pod \"maas-api-key-cleanup-29607615-hxsgg\" (UID: \"e59351fd-81e5-415c-92ac-94da8b9a7b76\") " pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg"
Apr 17 20:15:00.291282 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.291251 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj9pv\" (UniqueName: \"kubernetes.io/projected/e59351fd-81e5-415c-92ac-94da8b9a7b76-kube-api-access-fj9pv\") pod \"maas-api-key-cleanup-29607615-hxsgg\" (UID: \"e59351fd-81e5-415c-92ac-94da8b9a7b76\") " pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg"
Apr 17 20:15:00.453952 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.453873 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg"
Apr 17 20:15:00.595439 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.595407 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607615-hxsgg"]
Apr 17 20:15:00.597561 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:15:00.597499 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode59351fd_81e5_415c_92ac_94da8b9a7b76.slice/crio-1d3e8fa3584f0790db3ad454e4e176575f55df2997939acc65fffd097cd3d095 WatchSource:0}: Error finding container 1d3e8fa3584f0790db3ad454e4e176575f55df2997939acc65fffd097cd3d095: Status 404 returned error can't find the container with id 1d3e8fa3584f0790db3ad454e4e176575f55df2997939acc65fffd097cd3d095
Apr 17 20:15:00.827234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.827197 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg" event={"ID":"e59351fd-81e5-415c-92ac-94da8b9a7b76","Type":"ContainerStarted","Data":"1d3e8fa3584f0790db3ad454e4e176575f55df2997939acc65fffd097cd3d095"}
Apr 17 20:15:00.828332 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:00.828310 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" event={"ID":"0288f2a6-594b-4a71-bf11-4312437fc0b7","Type":"ContainerStarted","Data":"4e499ae7cc50785b5b5087a1d1e00f7cee03b0e445a11ff487998bcc32e3653c"}
Apr 17 20:15:01.834387 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:01.834344 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg" event={"ID":"e59351fd-81e5-415c-92ac-94da8b9a7b76","Type":"ContainerStarted","Data":"79370ba1c4bfb4abe5fff9f62810535310bac23e690c90042ea8f7c0be79ef0a"}
Apr 17 20:15:01.850154 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:01.850109 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg" podStartSLOduration=1.095939777 podStartE2EDuration="1.850092458s" podCreationTimestamp="2026-04-17 20:15:00 +0000 UTC" firstStartedPulling="2026-04-17 20:15:00.599652641 +0000 UTC m=+639.467472381" lastFinishedPulling="2026-04-17 20:15:01.353805323 +0000 UTC m=+640.221625062" observedRunningTime="2026-04-17 20:15:01.847632272 +0000 UTC m=+640.715452034" watchObservedRunningTime="2026-04-17 20:15:01.850092458 +0000 UTC m=+640.717912220"
Apr 17 20:15:07.467097 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.467058 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"]
Apr 17 20:15:07.470871 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.470833 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.473356 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.473335 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 17 20:15:07.481930 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.481903 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"]
Apr 17 20:15:07.552239 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.552208 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.552413 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.552270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.552413 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.552298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.552413 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.552363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.552413 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.552395 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcglk\" (UniqueName: \"kubernetes.io/projected/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-kube-api-access-lcglk\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.552591 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.552463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.653746 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.653708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.653951 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.653754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.653951 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.653808 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.653951 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.653833 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcglk\" (UniqueName: \"kubernetes.io/projected/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-kube-api-access-lcglk\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.653951 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.653934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.654182 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.653963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.654274 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.654247 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.654342 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.654296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.654342 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.654322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.655984 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.655963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.656465 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.656450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.661653 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.661629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcglk\" (UniqueName: \"kubernetes.io/projected/1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc-kube-api-access-lcglk\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7\" (UID: \"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.782012 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.781979 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:15:07.858707 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.858665 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" event={"ID":"0288f2a6-594b-4a71-bf11-4312437fc0b7","Type":"ContainerStarted","Data":"8aae8e6b89b904d64b5328bb6532290f7d403a67faf2eef35b7b13bc08d76465"}
Apr 17 20:15:07.915067 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:07.915041 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"]
Apr 17 20:15:07.917590 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:15:07.917561 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ba24aa5_a3f4_4b2d_a0dd_6b49459dc5fc.slice/crio-7511e59a93f112051643731a627ab4c83191c0211a38c5537ccd14d7c8fc96d4 WatchSource:0}: Error finding container 7511e59a93f112051643731a627ab4c83191c0211a38c5537ccd14d7c8fc96d4: Status 404 returned error can't find the container with id 7511e59a93f112051643731a627ab4c83191c0211a38c5537ccd14d7c8fc96d4
Apr 17 20:15:08.864467 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:08.864423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" event={"ID":"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc","Type":"ContainerStarted","Data":"992088fc71bffabfc9a43a8f6603116b066f686d7dc6590cc43268a818134fb1"}
Apr 17 20:15:08.864985 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:08.864476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" event={"ID":"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc","Type":"ContainerStarted","Data":"7511e59a93f112051643731a627ab4c83191c0211a38c5537ccd14d7c8fc96d4"}
Apr 17 20:15:12.879240 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:12.879200 2576 generic.go:358] "Generic (PLEG): container finished" podID="0288f2a6-594b-4a71-bf11-4312437fc0b7" containerID="8aae8e6b89b904d64b5328bb6532290f7d403a67faf2eef35b7b13bc08d76465" exitCode=0
Apr 17 20:15:12.879727 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:12.879247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" event={"ID":"0288f2a6-594b-4a71-bf11-4312437fc0b7","Type":"ContainerDied","Data":"8aae8e6b89b904d64b5328bb6532290f7d403a67faf2eef35b7b13bc08d76465"}
Apr 17 20:15:13.884715 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:13.884674 2576 generic.go:358] "Generic (PLEG): container finished" podID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" containerID="992088fc71bffabfc9a43a8f6603116b066f686d7dc6590cc43268a818134fb1" exitCode=0
Apr 17 20:15:13.885129 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:13.884755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" event={"ID":"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc","Type":"ContainerDied","Data":"992088fc71bffabfc9a43a8f6603116b066f686d7dc6590cc43268a818134fb1"}
Apr 17 20:15:14.889728 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:14.889700 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/0.log"
Apr 17 20:15:14.890123 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:14.890019 2576 generic.go:358] "Generic (PLEG): container finished" podID="0288f2a6-594b-4a71-bf11-4312437fc0b7" containerID="7eef872e30f6fe6792a7a9b56c25a1a79e1fc4e49a92757e2bf40371a5cd1c3e" exitCode=2
Apr 17 20:15:14.890123 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:14.890096 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" event={"ID":"0288f2a6-594b-4a71-bf11-4312437fc0b7","Type":"ContainerDied","Data":"7eef872e30f6fe6792a7a9b56c25a1a79e1fc4e49a92757e2bf40371a5cd1c3e"}
Apr 17 20:15:14.890514 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:14.890496 2576 scope.go:117] "RemoveContainer" containerID="7eef872e30f6fe6792a7a9b56c25a1a79e1fc4e49a92757e2bf40371a5cd1c3e"
Apr 17 20:15:14.891482 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:14.891464 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/0.log"
Apr 17 20:15:14.891799 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:14.891778 2576 generic.go:358] "Generic (PLEG): container finished" podID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" containerID="0d71994d270a5247df636bceee94705e418f27d47d62397b689549576088b4b4" exitCode=2
Apr 17 20:15:14.891875 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:14.891818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" event={"ID":"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc","Type":"ContainerDied","Data":"0d71994d270a5247df636bceee94705e418f27d47d62397b689549576088b4b4"}
Apr 17 20:15:14.892132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:14.892114 2576 scope.go:117] "RemoveContainer" containerID="0d71994d270a5247df636bceee94705e418f27d47d62397b689549576088b4b4"
Apr 17 20:15:15.897014 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:15.896988 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/1.log"
Apr 17 20:15:15.897470 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:15.897437 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/0.log"
Apr 17 20:15:15.897779 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:15.897759 2576 generic.go:358] "Generic (PLEG): container finished" podID="0288f2a6-594b-4a71-bf11-4312437fc0b7" containerID="6e40c9a0bf14366f6ddbd14a2ecfe076c706498f3c42bb2176788ff78243f210" exitCode=2
Apr 17 20:15:15.897869 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:15.897828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" event={"ID":"0288f2a6-594b-4a71-bf11-4312437fc0b7","Type":"ContainerDied","Data":"6e40c9a0bf14366f6ddbd14a2ecfe076c706498f3c42bb2176788ff78243f210"}
Apr 17 20:15:15.897993 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:15.897891 2576 scope.go:117] "RemoveContainer" containerID="7eef872e30f6fe6792a7a9b56c25a1a79e1fc4e49a92757e2bf40371a5cd1c3e"
Apr 17 20:15:15.898291 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:15.898276 2576 scope.go:117] "RemoveContainer" containerID="6e40c9a0bf14366f6ddbd14a2ecfe076c706498f3c42bb2176788ff78243f210"
Apr 17 20:15:15.898517 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:15.898496 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:15:15.899231 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:15.899216 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/1.log"
Apr 17 20:15:15.899594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:15.899582 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/0.log"
Apr 17 20:15:15.899874 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:15.899836 2576 generic.go:358] "Generic (PLEG): container finished" podID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" containerID="2495b2763ce2a8595fe98d3d8ab277882845fe9b41487bf4dd29610cd31cc536" exitCode=2
Apr 17 20:15:15.899945 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:15.899873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" event={"ID":"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc","Type":"ContainerDied","Data":"2495b2763ce2a8595fe98d3d8ab277882845fe9b41487bf4dd29610cd31cc536"}
Apr 17 20:15:15.900240 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:15.900223 2576 scope.go:117] "RemoveContainer" containerID="2495b2763ce2a8595fe98d3d8ab277882845fe9b41487bf4dd29610cd31cc536"
Apr 17 20:15:15.900417 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:15.900399 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc"
Apr 17 20:15:15.908284 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:15.908270 2576 scope.go:117] "RemoveContainer" containerID="0d71994d270a5247df636bceee94705e418f27d47d62397b689549576088b4b4"
Apr 17 20:15:16.904375 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:16.904346 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/1.log"
Apr 17 20:15:16.906013 ip-10-0-131-172
kubenswrapper[2576]: I0417 20:15:16.905996 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/1.log" Apr 17 20:15:17.782724 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:17.782690 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" Apr 17 20:15:17.782724 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:17.782726 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" Apr 17 20:15:17.783194 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:17.783175 2576 scope.go:117] "RemoveContainer" containerID="2495b2763ce2a8595fe98d3d8ab277882845fe9b41487bf4dd29610cd31cc536" Apr 17 20:15:17.783386 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:17.783368 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:15:20.081962 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:20.081924 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" Apr 17 20:15:20.081962 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:20.081967 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" Apr 17 20:15:20.082513 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:20.082485 2576 scope.go:117] "RemoveContainer" 
containerID="6e40c9a0bf14366f6ddbd14a2ecfe076c706498f3c42bb2176788ff78243f210" Apr 17 20:15:20.082698 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:20.082680 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 20:15:22.928162 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:22.928126 2576 generic.go:358] "Generic (PLEG): container finished" podID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerID="79370ba1c4bfb4abe5fff9f62810535310bac23e690c90042ea8f7c0be79ef0a" exitCode=6 Apr 17 20:15:22.928515 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:22.928195 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg" event={"ID":"e59351fd-81e5-415c-92ac-94da8b9a7b76","Type":"ContainerDied","Data":"79370ba1c4bfb4abe5fff9f62810535310bac23e690c90042ea8f7c0be79ef0a"} Apr 17 20:15:22.928515 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:22.928501 2576 scope.go:117] "RemoveContainer" containerID="79370ba1c4bfb4abe5fff9f62810535310bac23e690c90042ea8f7c0be79ef0a" Apr 17 20:15:23.933082 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:23.933047 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg" event={"ID":"e59351fd-81e5-415c-92ac-94da8b9a7b76","Type":"ContainerStarted","Data":"add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f"} Apr 17 20:15:26.465438 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.465404 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b"] Apr 17 20:15:26.469077 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.469056 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.471254 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.471228 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 17 20:15:26.478763 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.478736 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b"] Apr 17 20:15:26.626064 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.626020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.626253 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.626144 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.626253 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.626174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.626253 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.626208 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.626378 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.626267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.626378 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.626287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvkx7\" (UniqueName: \"kubernetes.io/projected/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-kube-api-access-hvkx7\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.727080 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.726992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.727080 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.727036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvkx7\" (UniqueName: 
\"kubernetes.io/projected/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-kube-api-access-hvkx7\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.727080 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.727077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.727322 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.727166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.727322 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.727200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.727322 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.727247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.727532 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:15:26.727506 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.727614 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.727546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.727671 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.727626 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.729523 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.729503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.729858 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.729825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: 
\"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.735932 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.735911 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvkx7\" (UniqueName: \"kubernetes.io/projected/b6316c7d-9ee3-4f7f-87b4-b97fb2117b36-kube-api-access-hvkx7\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-xgd2b\" (UID: \"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.780669 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.780636 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:26.905812 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.905784 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b"] Apr 17 20:15:26.908011 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:15:26.907977 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6316c7d_9ee3_4f7f_87b4_b97fb2117b36.slice/crio-3767895693dd9270581751a83949147b7577449311d7874feffb14e004e7bc6c WatchSource:0}: Error finding container 3767895693dd9270581751a83949147b7577449311d7874feffb14e004e7bc6c: Status 404 returned error can't find the container with id 3767895693dd9270581751a83949147b7577449311d7874feffb14e004e7bc6c Apr 17 20:15:26.943903 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:26.943874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" event={"ID":"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36","Type":"ContainerStarted","Data":"3767895693dd9270581751a83949147b7577449311d7874feffb14e004e7bc6c"} Apr 17 20:15:27.951890 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:27.951851 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" event={"ID":"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36","Type":"ContainerStarted","Data":"082b1c33c36bfde8ee56ca15efe45ae82ab22e18596b2a3d06594d4621650255"} Apr 17 20:15:31.468491 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.468455 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7"] Apr 17 20:15:31.496582 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.496552 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7"] Apr 17 20:15:31.496758 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.496734 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.499038 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.499017 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 17 20:15:31.569329 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.569285 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3cf31824-04be-4c34-9b73-76116db5f973-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.569499 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.569377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3cf31824-04be-4c34-9b73-76116db5f973-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.569553 ip-10-0-131-172 kubenswrapper[2576]: I0417 
20:15:31.569492 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf31824-04be-4c34-9b73-76116db5f973-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.569553 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.569524 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3cf31824-04be-4c34-9b73-76116db5f973-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.569632 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.569550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vw5m\" (UniqueName: \"kubernetes.io/projected/3cf31824-04be-4c34-9b73-76116db5f973-kube-api-access-7vw5m\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.569632 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.569583 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3cf31824-04be-4c34-9b73-76116db5f973-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.636517 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.636479 2576 scope.go:117] "RemoveContainer" containerID="6e40c9a0bf14366f6ddbd14a2ecfe076c706498f3c42bb2176788ff78243f210" Apr 17 20:15:31.636739 
ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.636717 2576 scope.go:117] "RemoveContainer" containerID="2495b2763ce2a8595fe98d3d8ab277882845fe9b41487bf4dd29610cd31cc536" Apr 17 20:15:31.670014 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.669987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3cf31824-04be-4c34-9b73-76116db5f973-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.670173 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.670098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf31824-04be-4c34-9b73-76116db5f973-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.670173 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.670128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3cf31824-04be-4c34-9b73-76116db5f973-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.670173 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.670153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vw5m\" (UniqueName: \"kubernetes.io/projected/3cf31824-04be-4c34-9b73-76116db5f973-kube-api-access-7vw5m\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.670338 ip-10-0-131-172 kubenswrapper[2576]: I0417 
20:15:31.670200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3cf31824-04be-4c34-9b73-76116db5f973-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.670338 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.670263 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3cf31824-04be-4c34-9b73-76116db5f973-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.670541 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.670517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3cf31824-04be-4c34-9b73-76116db5f973-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.670621 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.670577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3cf31824-04be-4c34-9b73-76116db5f973-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.670680 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.670620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3cf31824-04be-4c34-9b73-76116db5f973-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: 
\"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.672477 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.672450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3cf31824-04be-4c34-9b73-76116db5f973-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.672758 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.672740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf31824-04be-4c34-9b73-76116db5f973-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.677166 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.677140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vw5m\" (UniqueName: \"kubernetes.io/projected/3cf31824-04be-4c34-9b73-76116db5f973-kube-api-access-7vw5m\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-bctd7\" (UID: \"3cf31824-04be-4c34-9b73-76116db5f973\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.808118 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.808085 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" Apr 17 20:15:31.946795 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.946770 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7"] Apr 17 20:15:31.967347 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.967325 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/2.log" Apr 17 20:15:31.967697 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.967680 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/1.log" Apr 17 20:15:31.970690 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.970612 2576 generic.go:358] "Generic (PLEG): container finished" podID="0288f2a6-594b-4a71-bf11-4312437fc0b7" containerID="b87844fb3a75c10d3ca67ab48fee9bace4fbdd9e4d6dcaf1742e7cc87735819a" exitCode=2 Apr 17 20:15:31.970756 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.970700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" event={"ID":"0288f2a6-594b-4a71-bf11-4312437fc0b7","Type":"ContainerDied","Data":"b87844fb3a75c10d3ca67ab48fee9bace4fbdd9e4d6dcaf1742e7cc87735819a"} Apr 17 20:15:31.970756 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.970734 2576 scope.go:117] "RemoveContainer" containerID="6e40c9a0bf14366f6ddbd14a2ecfe076c706498f3c42bb2176788ff78243f210" Apr 17 20:15:31.971428 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.971264 2576 scope.go:117] "RemoveContainer" containerID="b87844fb3a75c10d3ca67ab48fee9bace4fbdd9e4d6dcaf1742e7cc87735819a" Apr 17 20:15:31.971524 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:31.971498 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 20:15:31.973361 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.973341 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/2.log" Apr 17 20:15:31.973764 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.973748 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/1.log" Apr 17 20:15:31.974079 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.974058 2576 generic.go:358] "Generic (PLEG): container finished" podID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" containerID="4ce145850b1a5a914c212910db1e40d826d65bff46871203e3c043249ee6266b" exitCode=2 Apr 17 20:15:31.974150 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.974083 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" event={"ID":"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc","Type":"ContainerDied","Data":"4ce145850b1a5a914c212910db1e40d826d65bff46871203e3c043249ee6266b"} Apr 17 20:15:31.974443 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:31.974430 2576 scope.go:117] "RemoveContainer" containerID="4ce145850b1a5a914c212910db1e40d826d65bff46871203e3c043249ee6266b" Apr 17 20:15:31.974613 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:31.974597 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main 
pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:15:32.004074 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:15:32.004030 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cf31824_04be_4c34_9b73_76116db5f973.slice/crio-a0001149b844c452c4c890e2245fcceb0bc9ff5ad3e93721a841aad1bd9f4c3e WatchSource:0}: Error finding container a0001149b844c452c4c890e2245fcceb0bc9ff5ad3e93721a841aad1bd9f4c3e: Status 404 returned error can't find the container with id a0001149b844c452c4c890e2245fcceb0bc9ff5ad3e93721a841aad1bd9f4c3e Apr 17 20:15:32.025542 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:32.025524 2576 scope.go:117] "RemoveContainer" containerID="2495b2763ce2a8595fe98d3d8ab277882845fe9b41487bf4dd29610cd31cc536" Apr 17 20:15:32.979161 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:32.979082 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/2.log" Apr 17 20:15:32.980924 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:32.980895 2576 generic.go:358] "Generic (PLEG): container finished" podID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" containerID="082b1c33c36bfde8ee56ca15efe45ae82ab22e18596b2a3d06594d4621650255" exitCode=0 Apr 17 20:15:32.981033 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:32.980964 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" event={"ID":"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36","Type":"ContainerDied","Data":"082b1c33c36bfde8ee56ca15efe45ae82ab22e18596b2a3d06594d4621650255"} Apr 17 20:15:32.982463 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:32.982442 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/2.log" Apr 17 20:15:32.984129 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:32.984111 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" event={"ID":"3cf31824-04be-4c34-9b73-76116db5f973","Type":"ContainerStarted","Data":"9ac65d193f2cdfa77068ee6f9cb9e91fc5c7e440a82fe0c520f20b33acaa6113"} Apr 17 20:15:32.984203 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:32.984134 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" event={"ID":"3cf31824-04be-4c34-9b73-76116db5f973","Type":"ContainerStarted","Data":"a0001149b844c452c4c890e2245fcceb0bc9ff5ad3e93721a841aad1bd9f4c3e"} Apr 17 20:15:33.565547 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.565517 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49"] Apr 17 20:15:33.568847 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.568822 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.570992 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.570969 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 17 20:15:33.577177 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.577151 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49"] Apr 17 20:15:33.589408 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.589379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d219c10a-62e0-44ad-8e0d-4def7d54e510-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.589546 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.589418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8d6\" (UniqueName: \"kubernetes.io/projected/d219c10a-62e0-44ad-8e0d-4def7d54e510-kube-api-access-fc8d6\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.589546 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.589439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d219c10a-62e0-44ad-8e0d-4def7d54e510-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.589546 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.589459 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d219c10a-62e0-44ad-8e0d-4def7d54e510-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.589713 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.589550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d219c10a-62e0-44ad-8e0d-4def7d54e510-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.589713 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.589672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d219c10a-62e0-44ad-8e0d-4def7d54e510-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.690459 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.690421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d219c10a-62e0-44ad-8e0d-4def7d54e510-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.690652 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.690485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d219c10a-62e0-44ad-8e0d-4def7d54e510-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" 
(UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.690652 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.690537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d219c10a-62e0-44ad-8e0d-4def7d54e510-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.690652 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.690564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8d6\" (UniqueName: \"kubernetes.io/projected/d219c10a-62e0-44ad-8e0d-4def7d54e510-kube-api-access-fc8d6\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.690652 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.690592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d219c10a-62e0-44ad-8e0d-4def7d54e510-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.690652 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.690621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d219c10a-62e0-44ad-8e0d-4def7d54e510-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.691047 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.691012 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d219c10a-62e0-44ad-8e0d-4def7d54e510-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.691169 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.691019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d219c10a-62e0-44ad-8e0d-4def7d54e510-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.691244 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.691214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d219c10a-62e0-44ad-8e0d-4def7d54e510-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.692858 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.692811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d219c10a-62e0-44ad-8e0d-4def7d54e510-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.693097 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.693077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d219c10a-62e0-44ad-8e0d-4def7d54e510-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " 
pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.698212 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.698187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8d6\" (UniqueName: \"kubernetes.io/projected/d219c10a-62e0-44ad-8e0d-4def7d54e510-kube-api-access-fc8d6\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49\" (UID: \"d219c10a-62e0-44ad-8e0d-4def7d54e510\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.880553 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.880461 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:15:33.990079 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.990050 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/0.log" Apr 17 20:15:33.990507 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.990410 2576 generic.go:358] "Generic (PLEG): container finished" podID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" containerID="1fdbda74231dce0747402c6d530afdbcd69864ff403f15ecb41ffb1437a2c257" exitCode=2 Apr 17 20:15:33.990507 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.990485 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" event={"ID":"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36","Type":"ContainerDied","Data":"1fdbda74231dce0747402c6d530afdbcd69864ff403f15ecb41ffb1437a2c257"} Apr 17 20:15:33.991122 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:33.991093 2576 scope.go:117] "RemoveContainer" containerID="1fdbda74231dce0747402c6d530afdbcd69864ff403f15ecb41ffb1437a2c257" Apr 17 20:15:34.009488 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:34.009460 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49"] Apr 17 
20:15:34.012138 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:15:34.012106 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd219c10a_62e0_44ad_8e0d_4def7d54e510.slice/crio-86c2182cbb552328505112205dd9c99b5325daaebb539c161e858fba0b43547d WatchSource:0}: Error finding container 86c2182cbb552328505112205dd9c99b5325daaebb539c161e858fba0b43547d: Status 404 returned error can't find the container with id 86c2182cbb552328505112205dd9c99b5325daaebb539c161e858fba0b43547d Apr 17 20:15:34.997092 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:34.997052 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/1.log" Apr 17 20:15:34.997586 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:34.997473 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/0.log" Apr 17 20:15:34.997830 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:34.997803 2576 generic.go:358] "Generic (PLEG): container finished" podID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" containerID="d7302c5b76d615e3156df6d3577324afda9371a97cf9de2f0dba9542174f6a60" exitCode=2 Apr 17 20:15:34.997971 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:34.997879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" event={"ID":"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36","Type":"ContainerDied","Data":"d7302c5b76d615e3156df6d3577324afda9371a97cf9de2f0dba9542174f6a60"} Apr 17 20:15:34.997971 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:34.997916 2576 scope.go:117] "RemoveContainer" containerID="1fdbda74231dce0747402c6d530afdbcd69864ff403f15ecb41ffb1437a2c257" Apr 17 20:15:34.998503 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:34.998344 2576 scope.go:117] "RemoveContainer" 
containerID="d7302c5b76d615e3156df6d3577324afda9371a97cf9de2f0dba9542174f6a60" Apr 17 20:15:34.998604 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:34.998586 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:15:34.999420 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:34.999395 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" event={"ID":"d219c10a-62e0-44ad-8e0d-4def7d54e510","Type":"ContainerStarted","Data":"acf8871745f88a97ad125550159efb06d10ca9ab6eb37cbffc57029772861bf8"} Apr 17 20:15:34.999536 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:34.999430 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" event={"ID":"d219c10a-62e0-44ad-8e0d-4def7d54e510","Type":"ContainerStarted","Data":"86c2182cbb552328505112205dd9c99b5325daaebb539c161e858fba0b43547d"} Apr 17 20:15:35.162411 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.162375 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4"] Apr 17 20:15:35.166313 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.166287 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.168656 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.168634 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 17 20:15:35.175362 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.175321 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4"] Apr 17 20:15:35.204448 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.204417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5a40d06f-f97a-4be6-a799-e52a7a311ff1-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.204634 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.204454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjtvd\" (UniqueName: \"kubernetes.io/projected/5a40d06f-f97a-4be6-a799-e52a7a311ff1-kube-api-access-mjtvd\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.204634 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.204523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5a40d06f-f97a-4be6-a799-e52a7a311ff1-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.204634 
ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.204541 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a40d06f-f97a-4be6-a799-e52a7a311ff1-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.204634 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.204616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5a40d06f-f97a-4be6-a799-e52a7a311ff1-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.204883 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.204639 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5a40d06f-f97a-4be6-a799-e52a7a311ff1-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.305348 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.305306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5a40d06f-f97a-4be6-a799-e52a7a311ff1-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.305348 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.305349 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mjtvd\" (UniqueName: \"kubernetes.io/projected/5a40d06f-f97a-4be6-a799-e52a7a311ff1-kube-api-access-mjtvd\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.305594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.305391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5a40d06f-f97a-4be6-a799-e52a7a311ff1-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.305594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.305410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a40d06f-f97a-4be6-a799-e52a7a311ff1-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.305594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.305497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5a40d06f-f97a-4be6-a799-e52a7a311ff1-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.305594 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.305523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/5a40d06f-f97a-4be6-a799-e52a7a311ff1-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.305814 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.305762 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5a40d06f-f97a-4be6-a799-e52a7a311ff1-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.305907 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.305867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5a40d06f-f97a-4be6-a799-e52a7a311ff1-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.305993 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.305969 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a40d06f-f97a-4be6-a799-e52a7a311ff1-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.307832 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.307804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5a40d06f-f97a-4be6-a799-e52a7a311ff1-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " 
pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.308233 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.308211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5a40d06f-f97a-4be6-a799-e52a7a311ff1-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.314673 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.314646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjtvd\" (UniqueName: \"kubernetes.io/projected/5a40d06f-f97a-4be6-a799-e52a7a311ff1-kube-api-access-mjtvd\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-w8lj4\" (UID: \"5a40d06f-f97a-4be6-a799-e52a7a311ff1\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.480325 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.480271 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:15:35.617545 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:35.617508 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4"] Apr 17 20:15:35.620483 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:15:35.620456 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a40d06f_f97a_4be6_a799_e52a7a311ff1.slice/crio-f1e1cdcec7da08b8593361b447b6909a1c4fae9d7ec9799a261214c335f3b6ee WatchSource:0}: Error finding container f1e1cdcec7da08b8593361b447b6909a1c4fae9d7ec9799a261214c335f3b6ee: Status 404 returned error can't find the container with id f1e1cdcec7da08b8593361b447b6909a1c4fae9d7ec9799a261214c335f3b6ee Apr 17 20:15:36.009783 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:36.009697 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/1.log" Apr 17 20:15:36.011795 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:36.011757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" event={"ID":"5a40d06f-f97a-4be6-a799-e52a7a311ff1","Type":"ContainerStarted","Data":"f10787cc1a759013f5a4666a3c499155270329ca8e460b27ce7571f4b8fec4ac"} Apr 17 20:15:36.011957 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:36.011804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" event={"ID":"5a40d06f-f97a-4be6-a799-e52a7a311ff1","Type":"ContainerStarted","Data":"f1e1cdcec7da08b8593361b447b6909a1c4fae9d7ec9799a261214c335f3b6ee"} Apr 17 20:15:36.781528 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:36.781491 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:36.781704 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:36.781542 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:15:36.782113 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:36.782095 2576 scope.go:117] "RemoveContainer" containerID="d7302c5b76d615e3156df6d3577324afda9371a97cf9de2f0dba9542174f6a60" Apr 17 20:15:36.782368 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:36.782342 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:15:37.782891 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:37.782790 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" Apr 17 20:15:37.782891 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:37.782854 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" Apr 17 20:15:37.783499 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:37.783477 2576 scope.go:117] "RemoveContainer" containerID="4ce145850b1a5a914c212910db1e40d826d65bff46871203e3c043249ee6266b" Apr 17 20:15:37.783762 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:37.783738 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:15:38.021372 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:38.021332 2576 generic.go:358] "Generic (PLEG): container finished" podID="3cf31824-04be-4c34-9b73-76116db5f973" containerID="9ac65d193f2cdfa77068ee6f9cb9e91fc5c7e440a82fe0c520f20b33acaa6113" exitCode=0 Apr 17 20:15:38.021532 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:38.021416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" event={"ID":"3cf31824-04be-4c34-9b73-76116db5f973","Type":"ContainerDied","Data":"9ac65d193f2cdfa77068ee6f9cb9e91fc5c7e440a82fe0c520f20b33acaa6113"} Apr 17 20:15:40.082478 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:40.082442 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" Apr 17 20:15:40.082871 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:40.082491 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" Apr 17 20:15:40.083132 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:40.083111 2576 scope.go:117] "RemoveContainer" containerID="b87844fb3a75c10d3ca67ab48fee9bace4fbdd9e4d6dcaf1742e7cc87735819a" Apr 17 20:15:40.083360 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:40.083335 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 20:15:41.036754 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:41.036716 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="d219c10a-62e0-44ad-8e0d-4def7d54e510" containerID="acf8871745f88a97ad125550159efb06d10ca9ab6eb37cbffc57029772861bf8" exitCode=0
Apr 17 20:15:41.036936 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:41.036793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" event={"ID":"d219c10a-62e0-44ad-8e0d-4def7d54e510","Type":"ContainerDied","Data":"acf8871745f88a97ad125550159efb06d10ca9ab6eb37cbffc57029772861bf8"}
Apr 17 20:15:43.045748 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:43.045723 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/0.log"
Apr 17 20:15:43.046173 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:43.046119 2576 generic.go:358] "Generic (PLEG): container finished" podID="d219c10a-62e0-44ad-8e0d-4def7d54e510" containerID="66c203753df760644e1119f895289b9a33cc174156bbc3f7b8259cd01f64d31c" exitCode=2
Apr 17 20:15:43.046231 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:43.046211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" event={"ID":"d219c10a-62e0-44ad-8e0d-4def7d54e510","Type":"ContainerDied","Data":"66c203753df760644e1119f895289b9a33cc174156bbc3f7b8259cd01f64d31c"}
Apr 17 20:15:43.046627 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:43.046607 2576 scope.go:117] "RemoveContainer" containerID="66c203753df760644e1119f895289b9a33cc174156bbc3f7b8259cd01f64d31c"
Apr 17 20:15:43.048220 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:43.048198 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" event={"ID":"3cf31824-04be-4c34-9b73-76116db5f973","Type":"ContainerStarted","Data":"db556ed5756a12aad05548e1fdad08e0fc5109133521233a0d5eb331be9f9396"}
Apr 17 20:15:43.048421 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:43.048395 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7"
Apr 17 20:15:43.081215 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:43.081144 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7" podStartSLOduration=7.9034697529999995 podStartE2EDuration="12.0811211s" podCreationTimestamp="2026-04-17 20:15:31 +0000 UTC" firstStartedPulling="2026-04-17 20:15:38.022313344 +0000 UTC m=+676.890133096" lastFinishedPulling="2026-04-17 20:15:42.1999647 +0000 UTC m=+681.067784443" observedRunningTime="2026-04-17 20:15:43.078772055 +0000 UTC m=+681.946591858" watchObservedRunningTime="2026-04-17 20:15:43.0811211 +0000 UTC m=+681.948940899"
Apr 17 20:15:43.881320 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:43.881232 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49"
Apr 17 20:15:43.881320 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:43.881273 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49"
Apr 17 20:15:44.052827 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:44.052792 2576 generic.go:358] "Generic (PLEG): container finished" podID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerID="add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f" exitCode=6
Apr 17 20:15:44.053263 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:44.052884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg" event={"ID":"e59351fd-81e5-415c-92ac-94da8b9a7b76","Type":"ContainerDied","Data":"add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f"}
Apr 17 20:15:44.053263 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:44.052934 2576 scope.go:117] "RemoveContainer" containerID="79370ba1c4bfb4abe5fff9f62810535310bac23e690c90042ea8f7c0be79ef0a"
Apr 17 20:15:44.053361 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:44.053291 2576 scope.go:117] "RemoveContainer" containerID="add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f"
Apr 17 20:15:44.053581 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:44.053549 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607615-hxsgg_opendatahub(e59351fd-81e5-415c-92ac-94da8b9a7b76)\"" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg" podUID="e59351fd-81e5-415c-92ac-94da8b9a7b76"
Apr 17 20:15:44.054745 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:44.054727 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/1.log"
Apr 17 20:15:44.055127 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:44.055105 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/0.log"
Apr 17 20:15:44.055419 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:44.055400 2576 generic.go:358] "Generic (PLEG): container finished" podID="d219c10a-62e0-44ad-8e0d-4def7d54e510" containerID="2d50c3df81ac9ff56206f355fcdcccdfa977b56e21b158804e1f35ddbd12b1f3" exitCode=2
Apr 17 20:15:44.055499 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:44.055479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" event={"ID":"d219c10a-62e0-44ad-8e0d-4def7d54e510","Type":"ContainerDied","Data":"2d50c3df81ac9ff56206f355fcdcccdfa977b56e21b158804e1f35ddbd12b1f3"}
Apr 17 20:15:44.055767 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:44.055750 2576 scope.go:117] "RemoveContainer" containerID="2d50c3df81ac9ff56206f355fcdcccdfa977b56e21b158804e1f35ddbd12b1f3"
Apr 17 20:15:44.055979 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:44.055960 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510"
Apr 17 20:15:44.065135 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:44.065118 2576 scope.go:117] "RemoveContainer" containerID="66c203753df760644e1119f895289b9a33cc174156bbc3f7b8259cd01f64d31c"
Apr 17 20:15:45.061128 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:45.061101 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/1.log"
Apr 17 20:15:45.061857 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:45.061820 2576 scope.go:117] "RemoveContainer" containerID="2d50c3df81ac9ff56206f355fcdcccdfa977b56e21b158804e1f35ddbd12b1f3"
Apr 17 20:15:45.062086 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:45.062066 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510"
Apr 17 20:15:45.062693 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:45.062669 2576 generic.go:358] "Generic (PLEG): container finished" podID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" containerID="f10787cc1a759013f5a4666a3c499155270329ca8e460b27ce7571f4b8fec4ac" exitCode=0
Apr 17 20:15:45.062782 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:45.062700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" event={"ID":"5a40d06f-f97a-4be6-a799-e52a7a311ff1","Type":"ContainerDied","Data":"f10787cc1a759013f5a4666a3c499155270329ca8e460b27ce7571f4b8fec4ac"}
Apr 17 20:15:46.067234 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:46.067207 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/0.log"
Apr 17 20:15:46.067658 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:46.067499 2576 generic.go:358] "Generic (PLEG): container finished" podID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" containerID="06da9f3402034c70300d98128208d5ad5d9ec0f33012f2df1c75b9f784342520" exitCode=2
Apr 17 20:15:46.067658 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:46.067539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" event={"ID":"5a40d06f-f97a-4be6-a799-e52a7a311ff1","Type":"ContainerDied","Data":"06da9f3402034c70300d98128208d5ad5d9ec0f33012f2df1c75b9f784342520"}
Apr 17 20:15:46.067888 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:46.067875 2576 scope.go:117] "RemoveContainer" containerID="06da9f3402034c70300d98128208d5ad5d9ec0f33012f2df1c75b9f784342520"
Apr 17 20:15:47.072785 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:47.072762 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/1.log"
Apr 17 20:15:47.073240 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:47.073167 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/0.log"
Apr 17 20:15:47.073469 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:47.073450 2576 generic.go:358] "Generic (PLEG): container finished" podID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" containerID="7941c7b404ed709111f245284401b5c59f1e14b0f4b17b3efb08b1630b5d630b" exitCode=2
Apr 17 20:15:47.073546 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:47.073525 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" event={"ID":"5a40d06f-f97a-4be6-a799-e52a7a311ff1","Type":"ContainerDied","Data":"7941c7b404ed709111f245284401b5c59f1e14b0f4b17b3efb08b1630b5d630b"}
Apr 17 20:15:47.073630 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:47.073569 2576 scope.go:117] "RemoveContainer" containerID="06da9f3402034c70300d98128208d5ad5d9ec0f33012f2df1c75b9f784342520"
Apr 17 20:15:47.074035 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:47.074018 2576 scope.go:117] "RemoveContainer" containerID="7941c7b404ed709111f245284401b5c59f1e14b0f4b17b3efb08b1630b5d630b"
Apr 17 20:15:47.074237 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:47.074214 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1"
Apr 17 20:15:48.084737 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:48.084704 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/1.log"
Apr 17 20:15:48.634638 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:48.634610 2576 scope.go:117] "RemoveContainer" containerID="4ce145850b1a5a914c212910db1e40d826d65bff46871203e3c043249ee6266b"
Apr 17 20:15:48.634817 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:48.634708 2576 scope.go:117] "RemoveContainer" containerID="d7302c5b76d615e3156df6d3577324afda9371a97cf9de2f0dba9542174f6a60"
Apr 17 20:15:48.634817 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:48.634801 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc"
Apr 17 20:15:49.090764 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:49.090737 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/2.log"
Apr 17 20:15:49.091180 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:49.091165 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/1.log"
Apr 17 20:15:49.091523 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:49.091502 2576 generic.go:358] "Generic (PLEG): container finished" podID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" containerID="18ed885d2767d07fc29f13b71f4317266ccaa1c36b7b0805b72a24adb266f118" exitCode=2
Apr 17 20:15:49.091602 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:49.091582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" event={"ID":"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36","Type":"ContainerDied","Data":"18ed885d2767d07fc29f13b71f4317266ccaa1c36b7b0805b72a24adb266f118"}
Apr 17 20:15:49.091647 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:49.091629 2576 scope.go:117] "RemoveContainer" containerID="d7302c5b76d615e3156df6d3577324afda9371a97cf9de2f0dba9542174f6a60"
Apr 17 20:15:49.092141 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:49.092118 2576 scope.go:117] "RemoveContainer" containerID="18ed885d2767d07fc29f13b71f4317266ccaa1c36b7b0805b72a24adb266f118"
Apr 17 20:15:49.092391 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:49.092369 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36"
Apr 17 20:15:50.096642 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:50.096613 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/2.log"
Apr 17 20:15:53.880984 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:53.880935 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49"
Apr 17 20:15:53.880984 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:53.880978 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49"
Apr 17 20:15:53.881470 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:53.881451 2576 scope.go:117] "RemoveContainer" containerID="2d50c3df81ac9ff56206f355fcdcccdfa977b56e21b158804e1f35ddbd12b1f3"
Apr 17 20:15:54.069361 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:54.069333 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-bctd7"
Apr 17 20:15:54.634404 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:54.634329 2576 scope.go:117] "RemoveContainer" containerID="add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f"
Apr 17 20:15:55.117859 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:55.117805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg" event={"ID":"e59351fd-81e5-415c-92ac-94da8b9a7b76","Type":"ContainerStarted","Data":"0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389"}
Apr 17 20:15:55.119306 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:55.119285 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/2.log"
Apr 17 20:15:55.119659 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:55.119643 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/1.log"
Apr 17 20:15:55.119937 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:55.119917 2576 generic.go:358] "Generic (PLEG): container finished" podID="d219c10a-62e0-44ad-8e0d-4def7d54e510" containerID="824b50022051de1eb442a94e1669682bf57a26023a5a9f000c14a46e0e887828" exitCode=2
Apr 17 20:15:55.120008 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:55.119968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" event={"ID":"d219c10a-62e0-44ad-8e0d-4def7d54e510","Type":"ContainerDied","Data":"824b50022051de1eb442a94e1669682bf57a26023a5a9f000c14a46e0e887828"}
Apr 17 20:15:55.120008 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:55.119998 2576 scope.go:117] "RemoveContainer" containerID="2d50c3df81ac9ff56206f355fcdcccdfa977b56e21b158804e1f35ddbd12b1f3"
Apr 17 20:15:55.120326 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:55.120309 2576 scope.go:117] "RemoveContainer" containerID="824b50022051de1eb442a94e1669682bf57a26023a5a9f000c14a46e0e887828"
Apr 17 20:15:55.120503 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:55.120487 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510"
Apr 17 20:15:55.480582 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:55.480544 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4"
Apr 17 20:15:55.480582 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:55.480592 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4"
Apr 17 20:15:55.481140 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:55.481121 2576 scope.go:117] "RemoveContainer" containerID="7941c7b404ed709111f245284401b5c59f1e14b0f4b17b3efb08b1630b5d630b"
Apr 17 20:15:55.481355 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:55.481335 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1"
Apr 17 20:15:55.634597 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:55.634567 2576 scope.go:117] "RemoveContainer" containerID="b87844fb3a75c10d3ca67ab48fee9bace4fbdd9e4d6dcaf1742e7cc87735819a"
Apr 17 20:15:55.659571 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:55.659537 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607615-hxsgg"]
Apr 17 20:15:56.124981 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:56.124960 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/2.log"
Apr 17 20:15:56.125725 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:56.125696 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg" podUID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerName="cleanup" containerID="cri-o://0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389" gracePeriod=30
Apr 17 20:15:56.781054 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:56.781024 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b"
Apr 17 20:15:56.781054 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:56.781060 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b"
Apr 17 20:15:56.781533 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:56.781516 2576 scope.go:117] "RemoveContainer" containerID="18ed885d2767d07fc29f13b71f4317266ccaa1c36b7b0805b72a24adb266f118"
Apr 17 20:15:56.781742 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:56.781722 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36"
Apr 17 20:15:57.130698 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:57.130673 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/3.log"
Apr 17 20:15:57.131135 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:57.131088 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/2.log"
Apr 17 20:15:57.131407 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:57.131386 2576 generic.go:358] "Generic (PLEG): container finished" podID="0288f2a6-594b-4a71-bf11-4312437fc0b7" containerID="f4e4e9d8ebe4ac0ec3b54adc792e734c1a268b711fd499bb175d58a9e1ebcce7" exitCode=2
Apr 17 20:15:57.131486 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:57.131445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" event={"ID":"0288f2a6-594b-4a71-bf11-4312437fc0b7","Type":"ContainerDied","Data":"f4e4e9d8ebe4ac0ec3b54adc792e734c1a268b711fd499bb175d58a9e1ebcce7"}
Apr 17 20:15:57.131486 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:57.131477 2576 scope.go:117] "RemoveContainer" containerID="b87844fb3a75c10d3ca67ab48fee9bace4fbdd9e4d6dcaf1742e7cc87735819a"
Apr 17 20:15:57.131911 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:57.131895 2576 scope.go:117] "RemoveContainer" containerID="f4e4e9d8ebe4ac0ec3b54adc792e734c1a268b711fd499bb175d58a9e1ebcce7"
Apr 17 20:15:57.132135 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:15:57.132112 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:15:58.135979 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:58.135952 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/3.log"
Apr 17 20:15:59.634537 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:15:59.634505 2576 scope.go:117] "RemoveContainer" containerID="4ce145850b1a5a914c212910db1e40d826d65bff46871203e3c043249ee6266b"
Apr 17 20:16:00.082372 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:00.082353 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:16:00.082438 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:00.082380 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:16:00.082751 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:00.082720 2576 scope.go:117] "RemoveContainer" containerID="f4e4e9d8ebe4ac0ec3b54adc792e734c1a268b711fd499bb175d58a9e1ebcce7"
Apr 17 20:16:00.082907 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:00.082888 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:16:01.149223 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:01.149196 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/3.log"
Apr 17 20:16:01.149617 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:01.149557 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/2.log"
Apr 17 20:16:01.149885 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:01.149857 2576 generic.go:358] "Generic (PLEG): container finished" podID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" containerID="fb59776d50465aea71e5f86f93bbbae186793fc90a2c5aa495cd81523ec83649" exitCode=2
Apr 17 20:16:01.149982 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:01.149875 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" event={"ID":"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc","Type":"ContainerDied","Data":"fb59776d50465aea71e5f86f93bbbae186793fc90a2c5aa495cd81523ec83649"}
Apr 17 20:16:01.149982 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:01.149927 2576 scope.go:117] "RemoveContainer" containerID="4ce145850b1a5a914c212910db1e40d826d65bff46871203e3c043249ee6266b"
Apr 17 20:16:01.150319 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:01.150303 2576 scope.go:117] "RemoveContainer" containerID="fb59776d50465aea71e5f86f93bbbae186793fc90a2c5aa495cd81523ec83649"
Apr 17 20:16:01.150517 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:01.150497 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc"
Apr 17 20:16:02.154551 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:02.154518 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/3.log"
Apr 17 20:16:03.881248 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:03.881211 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49"
Apr 17 20:16:03.881248 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:03.881253 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49"
Apr 17 20:16:03.881709 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:03.881655 2576 scope.go:117] "RemoveContainer" containerID="824b50022051de1eb442a94e1669682bf57a26023a5a9f000c14a46e0e887828"
Apr 17 20:16:03.881884 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:03.881862 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510"
Apr 17 20:16:06.634126 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:06.634095 2576 scope.go:117] "RemoveContainer" containerID="7941c7b404ed709111f245284401b5c59f1e14b0f4b17b3efb08b1630b5d630b"
Apr 17 20:16:07.172989 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:07.172916 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/2.log"
Apr 17 20:16:07.173286 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:07.173272 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/1.log"
Apr 17 20:16:07.173581 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:07.173559 2576 generic.go:358] "Generic (PLEG): container finished" podID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" containerID="445f971fc85536b90a46d904b80e7b24c7d13c22eb0f2187b9f01d27eca6024e" exitCode=2
Apr 17 20:16:07.173646 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:07.173628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" event={"ID":"5a40d06f-f97a-4be6-a799-e52a7a311ff1","Type":"ContainerDied","Data":"445f971fc85536b90a46d904b80e7b24c7d13c22eb0f2187b9f01d27eca6024e"}
Apr 17 20:16:07.173684 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:07.173671 2576 scope.go:117] "RemoveContainer" containerID="7941c7b404ed709111f245284401b5c59f1e14b0f4b17b3efb08b1630b5d630b"
Apr 17 20:16:07.174156 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:07.174130 2576 scope.go:117] "RemoveContainer" containerID="445f971fc85536b90a46d904b80e7b24c7d13c22eb0f2187b9f01d27eca6024e"
Apr 17 20:16:07.174372 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:07.174353 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1"
Apr 17 20:16:07.782657 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:07.782618 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:16:07.782657 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:07.782652 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:16:07.783126 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:07.783078 2576 scope.go:117] "RemoveContainer" containerID="fb59776d50465aea71e5f86f93bbbae186793fc90a2c5aa495cd81523ec83649"
Apr 17 20:16:07.783271 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:07.783254 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc"
Apr 17 20:16:08.181280 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:08.181255 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/2.log"
Apr 17 20:16:11.638469 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:11.638438 2576 scope.go:117] "RemoveContainer" containerID="18ed885d2767d07fc29f13b71f4317266ccaa1c36b7b0805b72a24adb266f118"
Apr 17 20:16:12.197134 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:12.197101 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/3.log"
Apr 17 20:16:12.197485 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:12.197469 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/2.log"
Apr 17 20:16:12.197763 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:12.197740 2576 generic.go:358] "Generic (PLEG): container finished" podID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" containerID="5a8ba45683a00f9c0d85dce696d2544029ccd88f1cee07059fd067d1fcf286dd" exitCode=2
Apr 17 20:16:12.197863 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:12.197797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" event={"ID":"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36","Type":"ContainerDied","Data":"5a8ba45683a00f9c0d85dce696d2544029ccd88f1cee07059fd067d1fcf286dd"}
Apr 17 20:16:12.197930 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:12.197863 2576 scope.go:117] "RemoveContainer" containerID="18ed885d2767d07fc29f13b71f4317266ccaa1c36b7b0805b72a24adb266f118"
Apr 17 20:16:12.198259 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:12.198242 2576 scope.go:117] "RemoveContainer" containerID="5a8ba45683a00f9c0d85dce696d2544029ccd88f1cee07059fd067d1fcf286dd"
Apr 17 20:16:12.198474 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:12.198455 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36"
Apr 17 20:16:13.202667 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:13.202641 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/3.log"
Apr 17 20:16:14.634278 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:14.634247 2576 scope.go:117] "RemoveContainer" containerID="f4e4e9d8ebe4ac0ec3b54adc792e734c1a268b711fd499bb175d58a9e1ebcce7"
Apr 17 20:16:14.634651 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:14.634443 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:16:15.466490 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:15.466466 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg"
Apr 17 20:16:15.480531 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:15.480348 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4"
Apr 17 20:16:15.480531 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:15.480387 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4"
Apr 17 20:16:15.480904 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:15.480883 2576 scope.go:117] "RemoveContainer" containerID="445f971fc85536b90a46d904b80e7b24c7d13c22eb0f2187b9f01d27eca6024e"
Apr 17 20:16:15.481132 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:15.481096 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1"
Apr 17 20:16:15.482664 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:15.482636 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj9pv\" (UniqueName: \"kubernetes.io/projected/e59351fd-81e5-415c-92ac-94da8b9a7b76-kube-api-access-fj9pv\") pod \"e59351fd-81e5-415c-92ac-94da8b9a7b76\" (UID: \"e59351fd-81e5-415c-92ac-94da8b9a7b76\") "
Apr 17 20:16:15.489491 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:15.489457 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59351fd-81e5-415c-92ac-94da8b9a7b76-kube-api-access-fj9pv" (OuterVolumeSpecName: "kube-api-access-fj9pv") pod "e59351fd-81e5-415c-92ac-94da8b9a7b76" (UID: "e59351fd-81e5-415c-92ac-94da8b9a7b76"). InnerVolumeSpecName "kube-api-access-fj9pv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:16:15.584161 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:15.584124 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fj9pv\" (UniqueName: \"kubernetes.io/projected/e59351fd-81e5-415c-92ac-94da8b9a7b76-kube-api-access-fj9pv\") on node \"ip-10-0-131-172.ec2.internal\" DevicePath \"\""
Apr 17 20:16:16.216376 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.216343 2576 generic.go:358] "Generic (PLEG): container finished" podID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerID="0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389" exitCode=6
Apr 17 20:16:16.216811 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.216407 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg"
Apr 17 20:16:16.216811 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.216431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg" event={"ID":"e59351fd-81e5-415c-92ac-94da8b9a7b76","Type":"ContainerDied","Data":"0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389"}
Apr 17 20:16:16.216811 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.216477 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607615-hxsgg" event={"ID":"e59351fd-81e5-415c-92ac-94da8b9a7b76","Type":"ContainerDied","Data":"1d3e8fa3584f0790db3ad454e4e176575f55df2997939acc65fffd097cd3d095"}
Apr 17 20:16:16.216811 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.216499 2576 scope.go:117] "RemoveContainer" containerID="0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389"
Apr 17 20:16:16.224497 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.224479 2576 scope.go:117] "RemoveContainer" containerID="add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f"
Apr
17 20:16:16.231776 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.231758 2576 scope.go:117] "RemoveContainer" containerID="0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389" Apr 17 20:16:16.232061 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:16.232043 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389\": container with ID starting with 0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389 not found: ID does not exist" containerID="0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389" Apr 17 20:16:16.232120 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.232070 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389"} err="failed to get container status \"0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389\": rpc error: code = NotFound desc = could not find container \"0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389\": container with ID starting with 0cd4b967a6da889d19724dacf5e2a67f379b95d3bc5c2ebc44ad0ff5597b6389 not found: ID does not exist" Apr 17 20:16:16.232120 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.232093 2576 scope.go:117] "RemoveContainer" containerID="add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f" Apr 17 20:16:16.232332 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:16.232314 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f\": container with ID starting with add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f not found: ID does not exist" containerID="add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f" Apr 17 20:16:16.232371 
ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.232338 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f"} err="failed to get container status \"add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f\": rpc error: code = NotFound desc = could not find container \"add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f\": container with ID starting with add31f9036a6caf717bec24bd97faf075090f37d3b7821d16c33f9f69b95d94f not found: ID does not exist" Apr 17 20:16:16.232531 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.232515 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607615-hxsgg"] Apr 17 20:16:16.238635 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.238614 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607615-hxsgg"] Apr 17 20:16:16.634177 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.634146 2576 scope.go:117] "RemoveContainer" containerID="824b50022051de1eb442a94e1669682bf57a26023a5a9f000c14a46e0e887828" Apr 17 20:16:16.781590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.781561 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:16:16.781590 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.781594 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" Apr 17 20:16:16.782038 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:16.782025 2576 scope.go:117] "RemoveContainer" containerID="5a8ba45683a00f9c0d85dce696d2544029ccd88f1cee07059fd067d1fcf286dd" Apr 17 20:16:16.782214 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:16.782198 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with 
CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:16:17.221658 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:17.221579 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/3.log" Apr 17 20:16:17.222062 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:17.221960 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/2.log" Apr 17 20:16:17.222290 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:17.222269 2576 generic.go:358] "Generic (PLEG): container finished" podID="d219c10a-62e0-44ad-8e0d-4def7d54e510" containerID="8039f66039f320dc4cf65a4a08248875ccdc3754cca1e1598dc916aedabe3b7b" exitCode=2 Apr 17 20:16:17.222370 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:17.222351 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" event={"ID":"d219c10a-62e0-44ad-8e0d-4def7d54e510","Type":"ContainerDied","Data":"8039f66039f320dc4cf65a4a08248875ccdc3754cca1e1598dc916aedabe3b7b"} Apr 17 20:16:17.222422 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:17.222395 2576 scope.go:117] "RemoveContainer" containerID="824b50022051de1eb442a94e1669682bf57a26023a5a9f000c14a46e0e887828" Apr 17 20:16:17.222918 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:17.222897 2576 scope.go:117] "RemoveContainer" containerID="8039f66039f320dc4cf65a4a08248875ccdc3754cca1e1598dc916aedabe3b7b" Apr 17 20:16:17.223151 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:17.223126 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s 
restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510" Apr 17 20:16:17.638601 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:17.638559 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e59351fd-81e5-415c-92ac-94da8b9a7b76" path="/var/lib/kubelet/pods/e59351fd-81e5-415c-92ac-94da8b9a7b76/volumes" Apr 17 20:16:18.227954 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:18.227927 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/3.log" Apr 17 20:16:19.634195 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:19.634167 2576 scope.go:117] "RemoveContainer" containerID="fb59776d50465aea71e5f86f93bbbae186793fc90a2c5aa495cd81523ec83649" Apr 17 20:16:19.634589 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:19.634354 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:16:23.881381 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:23.881334 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:16:23.881381 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:23.881378 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:16:23.881983 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:23.881951 2576 scope.go:117] "RemoveContainer" 
containerID="8039f66039f320dc4cf65a4a08248875ccdc3754cca1e1598dc916aedabe3b7b" Apr 17 20:16:23.882197 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:23.882171 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510" Apr 17 20:16:28.634399 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:28.634363 2576 scope.go:117] "RemoveContainer" containerID="445f971fc85536b90a46d904b80e7b24c7d13c22eb0f2187b9f01d27eca6024e" Apr 17 20:16:29.265192 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:29.265109 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/3.log" Apr 17 20:16:29.265493 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:29.265477 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/2.log" Apr 17 20:16:29.265812 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:29.265791 2576 generic.go:358] "Generic (PLEG): container finished" podID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" containerID="57dae53373f1079fe710a906854099cae96d054618eb7c23228671d83f3bf64a" exitCode=2 Apr 17 20:16:29.265886 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:29.265871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" event={"ID":"5a40d06f-f97a-4be6-a799-e52a7a311ff1","Type":"ContainerDied","Data":"57dae53373f1079fe710a906854099cae96d054618eb7c23228671d83f3bf64a"} Apr 17 20:16:29.265942 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:29.265910 2576 scope.go:117] 
"RemoveContainer" containerID="445f971fc85536b90a46d904b80e7b24c7d13c22eb0f2187b9f01d27eca6024e" Apr 17 20:16:29.266343 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:29.266318 2576 scope.go:117] "RemoveContainer" containerID="57dae53373f1079fe710a906854099cae96d054618eb7c23228671d83f3bf64a" Apr 17 20:16:29.266631 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:29.266573 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" Apr 17 20:16:29.634321 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:29.634291 2576 scope.go:117] "RemoveContainer" containerID="5a8ba45683a00f9c0d85dce696d2544029ccd88f1cee07059fd067d1fcf286dd" Apr 17 20:16:29.634488 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:29.634406 2576 scope.go:117] "RemoveContainer" containerID="f4e4e9d8ebe4ac0ec3b54adc792e734c1a268b711fd499bb175d58a9e1ebcce7" Apr 17 20:16:29.634778 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:29.634486 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:16:29.634778 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:29.634568 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" 
pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 20:16:30.270936 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:30.270903 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/3.log" Apr 17 20:16:31.638058 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:31.638028 2576 scope.go:117] "RemoveContainer" containerID="fb59776d50465aea71e5f86f93bbbae186793fc90a2c5aa495cd81523ec83649" Apr 17 20:16:31.638557 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:31.638249 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:16:35.480520 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:35.480479 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:16:35.480520 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:35.480525 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:16:35.481002 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:35.480953 2576 scope.go:117] "RemoveContainer" containerID="57dae53373f1079fe710a906854099cae96d054618eb7c23228671d83f3bf64a" Apr 17 20:16:35.481149 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:35.481130 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main 
pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" Apr 17 20:16:36.634002 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:36.633970 2576 scope.go:117] "RemoveContainer" containerID="8039f66039f320dc4cf65a4a08248875ccdc3754cca1e1598dc916aedabe3b7b" Apr 17 20:16:36.634387 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:36.634181 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510" Apr 17 20:16:41.636364 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:41.636334 2576 scope.go:117] "RemoveContainer" containerID="f4e4e9d8ebe4ac0ec3b54adc792e734c1a268b711fd499bb175d58a9e1ebcce7" Apr 17 20:16:42.321507 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:42.321473 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/4.log" Apr 17 20:16:42.321828 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:42.321811 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/3.log" Apr 17 20:16:42.322116 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:42.322098 2576 generic.go:358] "Generic (PLEG): container finished" podID="0288f2a6-594b-4a71-bf11-4312437fc0b7" containerID="dbfec1469ad2d5be3474029f9bc9f6beae881a9b5fdcd5486f583a7d899b859e" exitCode=2 Apr 17 20:16:42.322177 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:42.322164 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" event={"ID":"0288f2a6-594b-4a71-bf11-4312437fc0b7","Type":"ContainerDied","Data":"dbfec1469ad2d5be3474029f9bc9f6beae881a9b5fdcd5486f583a7d899b859e"} Apr 17 20:16:42.322216 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:42.322201 2576 scope.go:117] "RemoveContainer" containerID="f4e4e9d8ebe4ac0ec3b54adc792e734c1a268b711fd499bb175d58a9e1ebcce7" Apr 17 20:16:42.322613 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:42.322597 2576 scope.go:117] "RemoveContainer" containerID="dbfec1469ad2d5be3474029f9bc9f6beae881a9b5fdcd5486f583a7d899b859e" Apr 17 20:16:42.322887 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:42.322862 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 20:16:43.327653 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:43.327625 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/4.log" Apr 17 20:16:44.634256 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:44.634223 2576 scope.go:117] "RemoveContainer" containerID="fb59776d50465aea71e5f86f93bbbae186793fc90a2c5aa495cd81523ec83649" Apr 17 20:16:44.634722 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:44.634338 2576 scope.go:117] "RemoveContainer" containerID="5a8ba45683a00f9c0d85dce696d2544029ccd88f1cee07059fd067d1fcf286dd" Apr 17 20:16:44.634722 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:44.634556 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main 
pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:16:45.336659 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:45.336634 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/4.log" Apr 17 20:16:45.337130 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:45.337113 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/3.log" Apr 17 20:16:45.337425 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:45.337406 2576 generic.go:358] "Generic (PLEG): container finished" podID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" containerID="7800659bc66588402370428eff03d80a704e072ff720b14a0981d485f72dce5e" exitCode=2 Apr 17 20:16:45.337484 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:45.337460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" event={"ID":"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc","Type":"ContainerDied","Data":"7800659bc66588402370428eff03d80a704e072ff720b14a0981d485f72dce5e"} Apr 17 20:16:45.337538 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:45.337489 2576 scope.go:117] "RemoveContainer" containerID="fb59776d50465aea71e5f86f93bbbae186793fc90a2c5aa495cd81523ec83649" Apr 17 20:16:45.337926 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:45.337909 2576 scope.go:117] "RemoveContainer" containerID="7800659bc66588402370428eff03d80a704e072ff720b14a0981d485f72dce5e" Apr 17 20:16:45.338146 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:45.338124 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed 
container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:16:46.343113 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:46.343086 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/4.log" Apr 17 20:16:47.634773 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:47.634738 2576 scope.go:117] "RemoveContainer" containerID="57dae53373f1079fe710a906854099cae96d054618eb7c23228671d83f3bf64a" Apr 17 20:16:47.635203 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:47.634980 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" Apr 17 20:16:47.782326 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:47.782281 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" Apr 17 20:16:47.782326 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:47.782335 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" Apr 17 20:16:47.782832 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:47.782799 2576 scope.go:117] "RemoveContainer" containerID="7800659bc66588402370428eff03d80a704e072ff720b14a0981d485f72dce5e" Apr 17 20:16:47.783106 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:47.783084 2576 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:16:50.082153 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:50.082110 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" Apr 17 20:16:50.082153 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:50.082159 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" Apr 17 20:16:50.082672 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:50.082578 2576 scope.go:117] "RemoveContainer" containerID="dbfec1469ad2d5be3474029f9bc9f6beae881a9b5fdcd5486f583a7d899b859e" Apr 17 20:16:50.082768 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:50.082750 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 20:16:50.634589 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:50.634558 2576 scope.go:117] "RemoveContainer" containerID="8039f66039f320dc4cf65a4a08248875ccdc3754cca1e1598dc916aedabe3b7b" Apr 17 20:16:50.634767 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:50.634727 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main 
pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510" Apr 17 20:16:57.634619 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:57.634589 2576 scope.go:117] "RemoveContainer" containerID="5a8ba45683a00f9c0d85dce696d2544029ccd88f1cee07059fd067d1fcf286dd" Apr 17 20:16:58.396274 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:58.396248 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/4.log" Apr 17 20:16:58.396612 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:58.396596 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/3.log" Apr 17 20:16:58.396950 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:58.396927 2576 generic.go:358] "Generic (PLEG): container finished" podID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" containerID="587c5764838438ae84a0d53aada2d60b6db4b9750244bd40ca0aa477bb15b0d9" exitCode=2 Apr 17 20:16:58.397019 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:58.396998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" event={"ID":"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36","Type":"ContainerDied","Data":"587c5764838438ae84a0d53aada2d60b6db4b9750244bd40ca0aa477bb15b0d9"} Apr 17 20:16:58.397057 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:58.397042 2576 scope.go:117] "RemoveContainer" containerID="5a8ba45683a00f9c0d85dce696d2544029ccd88f1cee07059fd067d1fcf286dd" Apr 17 20:16:58.397404 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:58.397386 2576 scope.go:117] "RemoveContainer" containerID="587c5764838438ae84a0d53aada2d60b6db4b9750244bd40ca0aa477bb15b0d9" Apr 17 20:16:58.397631 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:58.397613 2576 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:16:59.402172 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:59.402141 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/4.log" Apr 17 20:16:59.634176 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:16:59.634139 2576 scope.go:117] "RemoveContainer" containerID="7800659bc66588402370428eff03d80a704e072ff720b14a0981d485f72dce5e" Apr 17 20:16:59.634404 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:16:59.634380 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:17:00.634162 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:00.634130 2576 scope.go:117] "RemoveContainer" containerID="57dae53373f1079fe710a906854099cae96d054618eb7c23228671d83f3bf64a" Apr 17 20:17:00.634527 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:00.634307 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" Apr 17 
20:17:02.634181 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:02.634149 2576 scope.go:117] "RemoveContainer" containerID="dbfec1469ad2d5be3474029f9bc9f6beae881a9b5fdcd5486f583a7d899b859e"
Apr 17 20:17:02.634573 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:02.634339 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:17:05.634785 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:05.634754 2576 scope.go:117] "RemoveContainer" containerID="8039f66039f320dc4cf65a4a08248875ccdc3754cca1e1598dc916aedabe3b7b"
Apr 17 20:17:06.428526 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:06.428501 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/4.log"
Apr 17 20:17:06.428936 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:06.428920 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/3.log"
Apr 17 20:17:06.429222 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:06.429203 2576 generic.go:358] "Generic (PLEG): container finished" podID="d219c10a-62e0-44ad-8e0d-4def7d54e510" containerID="1b8645bc4e647b6ac34bd60883008b09eac15bea8df96544c00d9e37c23c8d24" exitCode=2
Apr 17 20:17:06.429290 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:06.429273 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" event={"ID":"d219c10a-62e0-44ad-8e0d-4def7d54e510","Type":"ContainerDied","Data":"1b8645bc4e647b6ac34bd60883008b09eac15bea8df96544c00d9e37c23c8d24"}
Apr 17 20:17:06.429328 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:06.429313 2576 scope.go:117] "RemoveContainer" containerID="8039f66039f320dc4cf65a4a08248875ccdc3754cca1e1598dc916aedabe3b7b"
Apr 17 20:17:06.429681 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:06.429661 2576 scope.go:117] "RemoveContainer" containerID="1b8645bc4e647b6ac34bd60883008b09eac15bea8df96544c00d9e37c23c8d24"
Apr 17 20:17:06.429904 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:06.429887 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510"
Apr 17 20:17:06.781182 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:06.781105 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b"
Apr 17 20:17:06.781507 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:06.781210 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b"
Apr 17 20:17:06.781574 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:06.781560 2576 scope.go:117] "RemoveContainer" containerID="587c5764838438ae84a0d53aada2d60b6db4b9750244bd40ca0aa477bb15b0d9"
Apr 17 20:17:06.781761 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:06.781744 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36"
Apr 17 20:17:07.434288 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:07.434261 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/4.log"
Apr 17 20:17:07.435029 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:07.435012 2576 scope.go:117] "RemoveContainer" containerID="587c5764838438ae84a0d53aada2d60b6db4b9750244bd40ca0aa477bb15b0d9"
Apr 17 20:17:07.435209 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:07.435188 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36"
Apr 17 20:17:11.636431 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:11.636402 2576 scope.go:117] "RemoveContainer" containerID="7800659bc66588402370428eff03d80a704e072ff720b14a0981d485f72dce5e"
Apr 17 20:17:11.636832 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:11.636602 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc"
Apr 17 20:17:13.881291 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:13.881247 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49"
Apr 17 20:17:13.881291 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:13.881290 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49"
Apr 17 20:17:13.881788 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:13.881682 2576 scope.go:117] "RemoveContainer" containerID="1b8645bc4e647b6ac34bd60883008b09eac15bea8df96544c00d9e37c23c8d24"
Apr 17 20:17:13.881916 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:13.881894 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510"
Apr 17 20:17:14.633810 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:14.633777 2576 scope.go:117] "RemoveContainer" containerID="dbfec1469ad2d5be3474029f9bc9f6beae881a9b5fdcd5486f583a7d899b859e"
Apr 17 20:17:14.634015 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:14.633996 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:17:15.633961 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:15.633923 2576 scope.go:117] "RemoveContainer" containerID="57dae53373f1079fe710a906854099cae96d054618eb7c23228671d83f3bf64a"
Apr 17 20:17:16.468425 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:16.468396 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/4.log"
Apr 17 20:17:16.468776 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:16.468761 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/3.log"
Apr 17 20:17:16.469066 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:16.469044 2576 generic.go:358] "Generic (PLEG): container finished" podID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" containerID="514fbc84c64089cfe7ed82ae068ddf28047064a4fc9d4e50a25a0c3da0d459bb" exitCode=2
Apr 17 20:17:16.469143 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:16.469097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" event={"ID":"5a40d06f-f97a-4be6-a799-e52a7a311ff1","Type":"ContainerDied","Data":"514fbc84c64089cfe7ed82ae068ddf28047064a4fc9d4e50a25a0c3da0d459bb"}
Apr 17 20:17:16.469143 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:16.469134 2576 scope.go:117] "RemoveContainer" containerID="57dae53373f1079fe710a906854099cae96d054618eb7c23228671d83f3bf64a"
Apr 17 20:17:16.469637 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:16.469620 2576 scope.go:117] "RemoveContainer" containerID="514fbc84c64089cfe7ed82ae068ddf28047064a4fc9d4e50a25a0c3da0d459bb"
Apr 17 20:17:16.469860 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:16.469827 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1"
Apr 17 20:17:17.473517 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:17.473487 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/4.log"
Apr 17 20:17:21.636592 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:21.636564 2576 scope.go:117] "RemoveContainer" containerID="587c5764838438ae84a0d53aada2d60b6db4b9750244bd40ca0aa477bb15b0d9"
Apr 17 20:17:21.636972 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:21.636790 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36"
Apr 17 20:17:24.634273 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:24.634240 2576 scope.go:117] "RemoveContainer" containerID="7800659bc66588402370428eff03d80a704e072ff720b14a0981d485f72dce5e"
Apr 17 20:17:24.634663 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:24.634433 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc"
Apr 17 20:17:25.481428 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:25.481380 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4"
Apr 17 20:17:25.481428 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:25.481431 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4"
Apr 17 20:17:25.481924 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:25.481906 2576 scope.go:117] "RemoveContainer" containerID="514fbc84c64089cfe7ed82ae068ddf28047064a4fc9d4e50a25a0c3da0d459bb"
Apr 17 20:17:25.482130 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:25.482113 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1"
Apr 17 20:17:26.634485 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:26.634458 2576 scope.go:117] "RemoveContainer" containerID="1b8645bc4e647b6ac34bd60883008b09eac15bea8df96544c00d9e37c23c8d24"
Apr 17 20:17:26.634887 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:26.634614 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510"
Apr 17 20:17:29.634487 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:29.634456 2576 scope.go:117] "RemoveContainer" containerID="dbfec1469ad2d5be3474029f9bc9f6beae881a9b5fdcd5486f583a7d899b859e"
Apr 17 20:17:29.634896 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:29.634646 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:17:35.634871 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:35.634774 2576 scope.go:117] "RemoveContainer" containerID="587c5764838438ae84a0d53aada2d60b6db4b9750244bd40ca0aa477bb15b0d9"
Apr 17 20:17:35.635212 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:35.635026 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36"
Apr 17 20:17:36.634052 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:36.634022 2576 scope.go:117] "RemoveContainer" containerID="7800659bc66588402370428eff03d80a704e072ff720b14a0981d485f72dce5e"
Apr 17 20:17:36.634223 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:36.634187 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc"
Apr 17 20:17:37.634322 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:37.634285 2576 scope.go:117] "RemoveContainer" containerID="1b8645bc4e647b6ac34bd60883008b09eac15bea8df96544c00d9e37c23c8d24"
Apr 17 20:17:37.634705 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:37.634544 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510"
Apr 17 20:17:39.633995 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:39.633959 2576 scope.go:117] "RemoveContainer" containerID="514fbc84c64089cfe7ed82ae068ddf28047064a4fc9d4e50a25a0c3da0d459bb"
Apr 17 20:17:39.634389 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:39.634145 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1"
Apr 17 20:17:42.633683 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:42.633647 2576 scope.go:117] "RemoveContainer" containerID="dbfec1469ad2d5be3474029f9bc9f6beae881a9b5fdcd5486f583a7d899b859e"
Apr 17 20:17:42.634128 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:42.633878 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:17:48.634265 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:48.634227 2576 scope.go:117] "RemoveContainer" containerID="587c5764838438ae84a0d53aada2d60b6db4b9750244bd40ca0aa477bb15b0d9"
Apr 17 20:17:48.634733 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:48.634412 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36"
Apr 17 20:17:49.634299 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:49.634261 2576 scope.go:117] "RemoveContainer" containerID="7800659bc66588402370428eff03d80a704e072ff720b14a0981d485f72dce5e"
Apr 17 20:17:49.634789 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:49.634451 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc"
Apr 17 20:17:52.634390 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:52.634356 2576 scope.go:117] "RemoveContainer" containerID="514fbc84c64089cfe7ed82ae068ddf28047064a4fc9d4e50a25a0c3da0d459bb"
Apr 17 20:17:52.634782 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:52.634444 2576 scope.go:117] "RemoveContainer" containerID="1b8645bc4e647b6ac34bd60883008b09eac15bea8df96544c00d9e37c23c8d24"
Apr 17 20:17:52.634782 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:52.634568 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1"
Apr 17 20:17:52.634782 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:52.634619 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510"
Apr 17 20:17:55.634486 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:55.634455 2576 scope.go:117] "RemoveContainer" containerID="dbfec1469ad2d5be3474029f9bc9f6beae881a9b5fdcd5486f583a7d899b859e"
Apr 17 20:17:55.635029 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:55.634658 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:17:59.638772 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:17:59.638733 2576 scope.go:117] "RemoveContainer" containerID="587c5764838438ae84a0d53aada2d60b6db4b9750244bd40ca0aa477bb15b0d9"
Apr 17 20:17:59.639183 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:17:59.638925 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36"
Apr 17 20:18:02.634993 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:02.634960 2576 scope.go:117] "RemoveContainer" containerID="7800659bc66588402370428eff03d80a704e072ff720b14a0981d485f72dce5e"
Apr 17 20:18:02.635422 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:02.635213 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc"
Apr 17 20:18:04.634347 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:04.634315 2576 scope.go:117] "RemoveContainer" containerID="514fbc84c64089cfe7ed82ae068ddf28047064a4fc9d4e50a25a0c3da0d459bb"
Apr 17 20:18:04.634744 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:04.634507 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1"
Apr 17 20:18:05.639197 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:05.639161 2576 scope.go:117] "RemoveContainer" containerID="1b8645bc4e647b6ac34bd60883008b09eac15bea8df96544c00d9e37c23c8d24"
Apr 17 20:18:05.639605 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:05.639433 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510"
Apr 17 20:18:08.634112 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:08.634081 2576 scope.go:117] "RemoveContainer" containerID="dbfec1469ad2d5be3474029f9bc9f6beae881a9b5fdcd5486f583a7d899b859e"
Apr 17 20:18:09.658788 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:09.658758 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/5.log"
Apr 17 20:18:09.659209 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:09.659126 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/4.log"
Apr 17 20:18:09.659444 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:09.659424 2576 generic.go:358] "Generic (PLEG): container finished" podID="0288f2a6-594b-4a71-bf11-4312437fc0b7" containerID="f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2" exitCode=2
Apr 17 20:18:09.659520 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:09.659498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" event={"ID":"0288f2a6-594b-4a71-bf11-4312437fc0b7","Type":"ContainerDied","Data":"f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2"}
Apr 17 20:18:09.659557 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:09.659542 2576 scope.go:117] "RemoveContainer" containerID="dbfec1469ad2d5be3474029f9bc9f6beae881a9b5fdcd5486f583a7d899b859e"
Apr 17 20:18:09.659941 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:09.659923 2576 scope.go:117] "RemoveContainer" containerID="f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2"
Apr 17 20:18:09.660147 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:09.660128 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:18:10.082164 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:10.082130 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:18:10.082164 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:10.082167 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm"
Apr 17 20:18:10.664816 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:10.664788 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/5.log"
Apr 17 20:18:10.665528 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:10.665513 2576 scope.go:117] "RemoveContainer" containerID="f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2"
Apr 17 20:18:10.665704 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:10.665687 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:18:13.639165 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:13.639130 2576 scope.go:117] "RemoveContainer" containerID="7800659bc66588402370428eff03d80a704e072ff720b14a0981d485f72dce5e"
Apr 17 20:18:13.639652 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:13.639215 2576 scope.go:117] "RemoveContainer" containerID="587c5764838438ae84a0d53aada2d60b6db4b9750244bd40ca0aa477bb15b0d9"
Apr 17 20:18:13.639652 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:13.639367 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36"
Apr 17 20:18:14.681930 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:14.681895 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/5.log"
Apr 17 20:18:14.682315 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:14.682282 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/4.log"
Apr 17 20:18:14.682578 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:14.682558 2576 generic.go:358] "Generic (PLEG): container finished" podID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" containerID="de18846fe6f94b8dee6791d125e90192d8e0f8f7678a0adb014c6fddd51fea49" exitCode=2
Apr 17 20:18:14.682647 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:14.682596 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" event={"ID":"1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc","Type":"ContainerDied","Data":"de18846fe6f94b8dee6791d125e90192d8e0f8f7678a0adb014c6fddd51fea49"}
Apr 17 20:18:14.682647 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:14.682623 2576 scope.go:117] "RemoveContainer" containerID="7800659bc66588402370428eff03d80a704e072ff720b14a0981d485f72dce5e"
Apr 17 20:18:14.683074 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:14.683052 2576 scope.go:117] "RemoveContainer" containerID="de18846fe6f94b8dee6791d125e90192d8e0f8f7678a0adb014c6fddd51fea49"
Apr 17 20:18:14.683288 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:14.683267 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc"
Apr 17 20:18:15.686690 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:15.686661 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/5.log"
Apr 17 20:18:16.634131 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:16.634099 2576 scope.go:117] "RemoveContainer" containerID="514fbc84c64089cfe7ed82ae068ddf28047064a4fc9d4e50a25a0c3da0d459bb"
Apr 17 20:18:16.634315 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:16.634296 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1"
Apr 17 20:18:17.782661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:17.782627 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:18:17.782661 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:17.782661 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7"
Apr 17 20:18:17.783208 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:17.783190 2576 scope.go:117] "RemoveContainer" containerID="de18846fe6f94b8dee6791d125e90192d8e0f8f7678a0adb014c6fddd51fea49"
Apr 17 20:18:17.783411 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:17.783392 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc"
Apr 17 20:18:19.634056 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:19.634021 2576 scope.go:117] "RemoveContainer" containerID="1b8645bc4e647b6ac34bd60883008b09eac15bea8df96544c00d9e37c23c8d24"
Apr 17 20:18:19.634439 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:19.634280 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510"
Apr 17 20:18:21.640127 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:21.640084 2576 scope.go:117] "RemoveContainer" containerID="f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2"
Apr 17 20:18:21.640711 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:21.640464 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:18:27.634449 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:27.634410 2576 scope.go:117] "RemoveContainer" containerID="587c5764838438ae84a0d53aada2d60b6db4b9750244bd40ca0aa477bb15b0d9"
Apr 17 20:18:28.730727 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:28.730700 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/5.log"
Apr 17 20:18:28.731130 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:28.731058 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/4.log"
Apr 17 20:18:28.731368 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:28.731346 2576 generic.go:358] "Generic (PLEG): container finished" podID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" containerID="9c28127bc1a9fbeb564ee9ed794e234079dbd288aea73b16595d8d4076c84d59" exitCode=2
Apr 17 20:18:28.731442 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:28.731399 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" event={"ID":"b6316c7d-9ee3-4f7f-87b4-b97fb2117b36","Type":"ContainerDied","Data":"9c28127bc1a9fbeb564ee9ed794e234079dbd288aea73b16595d8d4076c84d59"}
Apr 17 20:18:28.731442 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:28.731434 2576 scope.go:117] "RemoveContainer" containerID="587c5764838438ae84a0d53aada2d60b6db4b9750244bd40ca0aa477bb15b0d9"
Apr 17 20:18:28.731893 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:28.731873 2576 scope.go:117] "RemoveContainer" containerID="9c28127bc1a9fbeb564ee9ed794e234079dbd288aea73b16595d8d4076c84d59"
Apr 17 20:18:28.732130 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:28.732110 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36"
Apr 17 20:18:29.736581 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:29.736549 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/5.log"
Apr 17 20:18:30.634012 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:30.633980 2576 scope.go:117] "RemoveContainer" containerID="de18846fe6f94b8dee6791d125e90192d8e0f8f7678a0adb014c6fddd51fea49"
Apr 17 20:18:30.634213 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:30.634183 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc"
Apr 17 20:18:31.636666 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:31.636635 2576 scope.go:117] "RemoveContainer" containerID="514fbc84c64089cfe7ed82ae068ddf28047064a4fc9d4e50a25a0c3da0d459bb"
Apr 17 20:18:31.637090 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:31.636884 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1"
Apr 17 20:18:33.634139 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:33.634105 2576 scope.go:117] "RemoveContainer" containerID="1b8645bc4e647b6ac34bd60883008b09eac15bea8df96544c00d9e37c23c8d24"
Apr 17 20:18:34.755722 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:34.755696 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/5.log"
Apr 17 20:18:34.756142 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:34.756129 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/4.log"
Apr 17 20:18:34.756451 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:34.756424 2576 generic.go:358] "Generic (PLEG): container finished" podID="d219c10a-62e0-44ad-8e0d-4def7d54e510" containerID="5fe02759f1f762d3e5593bc3892ed4cd0bb44ae2606745e1260cc5eb1771f338" exitCode=2
Apr 17 20:18:34.756574 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:34.756471 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" event={"ID":"d219c10a-62e0-44ad-8e0d-4def7d54e510","Type":"ContainerDied","Data":"5fe02759f1f762d3e5593bc3892ed4cd0bb44ae2606745e1260cc5eb1771f338"}
Apr 17 20:18:34.756574 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:34.756510 2576 scope.go:117] "RemoveContainer" containerID="1b8645bc4e647b6ac34bd60883008b09eac15bea8df96544c00d9e37c23c8d24"
Apr 17 20:18:34.756948 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:34.756929 2576 scope.go:117] "RemoveContainer" containerID="5fe02759f1f762d3e5593bc3892ed4cd0bb44ae2606745e1260cc5eb1771f338"
Apr 17 20:18:34.757182 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:34.757164 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510"
Apr 17 20:18:35.760552 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:35.760521 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/5.log"
Apr 17 20:18:36.634478 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:36.634450 2576 scope.go:117] "RemoveContainer" containerID="f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2"
Apr 17 20:18:36.634661 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:36.634641 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7"
Apr 17 20:18:36.781174 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:36.781148 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b"
Apr 17 20:18:36.781620 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:36.781187 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b"
Apr 17 20:18:36.781749 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:36.781731 2576 scope.go:117] "RemoveContainer" containerID="9c28127bc1a9fbeb564ee9ed794e234079dbd288aea73b16595d8d4076c84d59"
Apr 17 20:18:36.781986 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:36.781966 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36"
Apr 17 20:18:42.633691 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:42.633658 2576 scope.go:117] "RemoveContainer" containerID="514fbc84c64089cfe7ed82ae068ddf28047064a4fc9d4e50a25a0c3da0d459bb"
Apr 17 20:18:43.791119 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:43.791089 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/5.log"
Apr 17 20:18:43.791587 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:43.791485 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/4.log"
Apr 17 20:18:43.791801 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:43.791781 2576 generic.go:358] "Generic (PLEG): container finished"
podID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" containerID="673f36f070d27dabeef3f86e070e3a7246c693a7cfc7ae0fd6199dc2f2a8928f" exitCode=2 Apr 17 20:18:43.791899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:43.791879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" event={"ID":"5a40d06f-f97a-4be6-a799-e52a7a311ff1","Type":"ContainerDied","Data":"673f36f070d27dabeef3f86e070e3a7246c693a7cfc7ae0fd6199dc2f2a8928f"} Apr 17 20:18:43.791956 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:43.791928 2576 scope.go:117] "RemoveContainer" containerID="514fbc84c64089cfe7ed82ae068ddf28047064a4fc9d4e50a25a0c3da0d459bb" Apr 17 20:18:43.792301 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:43.792285 2576 scope.go:117] "RemoveContainer" containerID="673f36f070d27dabeef3f86e070e3a7246c693a7cfc7ae0fd6199dc2f2a8928f" Apr 17 20:18:43.792533 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:43.792508 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" Apr 17 20:18:43.881006 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:43.880972 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:18:43.881006 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:43.881006 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" Apr 17 20:18:43.881405 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:43.881391 2576 scope.go:117] "RemoveContainer" 
containerID="5fe02759f1f762d3e5593bc3892ed4cd0bb44ae2606745e1260cc5eb1771f338" Apr 17 20:18:43.881594 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:43.881577 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510" Apr 17 20:18:44.633896 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:44.633867 2576 scope.go:117] "RemoveContainer" containerID="de18846fe6f94b8dee6791d125e90192d8e0f8f7678a0adb014c6fddd51fea49" Apr 17 20:18:44.634084 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:44.634066 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:18:44.796714 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:44.796682 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/5.log" Apr 17 20:18:45.481074 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:45.481035 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:18:45.481074 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:45.481075 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" Apr 17 20:18:45.481465 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:18:45.481453 2576 scope.go:117] "RemoveContainer" containerID="673f36f070d27dabeef3f86e070e3a7246c693a7cfc7ae0fd6199dc2f2a8928f" Apr 17 20:18:45.481643 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:45.481626 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" Apr 17 20:18:47.634552 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:47.634521 2576 scope.go:117] "RemoveContainer" containerID="f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2" Apr 17 20:18:47.635047 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:47.634707 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 20:18:47.635047 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:47.634783 2576 scope.go:117] "RemoveContainer" containerID="9c28127bc1a9fbeb564ee9ed794e234079dbd288aea73b16595d8d4076c84d59" Apr 17 20:18:47.635047 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:47.634955 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:18:54.634582 
ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:54.634549 2576 scope.go:117] "RemoveContainer" containerID="5fe02759f1f762d3e5593bc3892ed4cd0bb44ae2606745e1260cc5eb1771f338" Apr 17 20:18:54.635074 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:54.634746 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510" Apr 17 20:18:56.634133 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:56.634103 2576 scope.go:117] "RemoveContainer" containerID="de18846fe6f94b8dee6791d125e90192d8e0f8f7678a0adb014c6fddd51fea49" Apr 17 20:18:56.634513 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:56.634262 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:18:59.634502 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:18:59.634469 2576 scope.go:117] "RemoveContainer" containerID="f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2" Apr 17 20:18:59.634913 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:18:59.634653 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 
20:19:00.634566 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:00.634533 2576 scope.go:117] "RemoveContainer" containerID="673f36f070d27dabeef3f86e070e3a7246c693a7cfc7ae0fd6199dc2f2a8928f" Apr 17 20:19:00.634979 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:00.634726 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" Apr 17 20:19:01.636953 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:01.636916 2576 scope.go:117] "RemoveContainer" containerID="9c28127bc1a9fbeb564ee9ed794e234079dbd288aea73b16595d8d4076c84d59" Apr 17 20:19:01.637325 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:01.637227 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:19:05.634619 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:05.634520 2576 scope.go:117] "RemoveContainer" containerID="5fe02759f1f762d3e5593bc3892ed4cd0bb44ae2606745e1260cc5eb1771f338" Apr 17 20:19:05.635117 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:05.634693 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510" Apr 17 
20:19:09.634150 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:09.634119 2576 scope.go:117] "RemoveContainer" containerID="de18846fe6f94b8dee6791d125e90192d8e0f8f7678a0adb014c6fddd51fea49" Apr 17 20:19:09.634534 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:09.634344 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:19:10.633866 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:10.633807 2576 scope.go:117] "RemoveContainer" containerID="f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2" Apr 17 20:19:10.634063 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:10.634022 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 20:19:13.633746 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:13.633713 2576 scope.go:117] "RemoveContainer" containerID="9c28127bc1a9fbeb564ee9ed794e234079dbd288aea73b16595d8d4076c84d59" Apr 17 20:19:13.634188 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:13.633931 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" 
podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:19:14.633908 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:14.633875 2576 scope.go:117] "RemoveContainer" containerID="673f36f070d27dabeef3f86e070e3a7246c693a7cfc7ae0fd6199dc2f2a8928f" Apr 17 20:19:14.634314 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:14.634064 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" Apr 17 20:19:17.633797 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:17.633764 2576 scope.go:117] "RemoveContainer" containerID="5fe02759f1f762d3e5593bc3892ed4cd0bb44ae2606745e1260cc5eb1771f338" Apr 17 20:19:17.634200 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:17.633982 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510" Apr 17 20:19:21.562076 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:21.562033 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/5.log" Apr 17 20:19:21.562920 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:21.562899 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/5.log" Apr 17 20:19:21.563301 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:21.563284 2576 log.go:25] 
"Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/5.log" Apr 17 20:19:21.563939 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:21.563915 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/5.log" Apr 17 20:19:21.564087 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:21.564073 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/5.log" Apr 17 20:19:21.564654 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:21.564642 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/5.log" Apr 17 20:19:21.565061 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:21.565041 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/5.log" Apr 17 20:19:21.565322 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:21.565306 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/5.log" Apr 17 20:19:21.565686 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:21.565666 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/5.log" Apr 17 20:19:21.566328 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:21.566313 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/5.log" Apr 17 20:19:21.579365 ip-10-0-131-172 kubenswrapper[2576]: I0417 
20:19:21.579344 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/0.log" Apr 17 20:19:21.580296 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:21.580279 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/0.log" Apr 17 20:19:21.636057 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:21.636030 2576 scope.go:117] "RemoveContainer" containerID="f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2" Apr 17 20:19:21.636509 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:21.636475 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 20:19:23.634055 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:23.634027 2576 scope.go:117] "RemoveContainer" containerID="de18846fe6f94b8dee6791d125e90192d8e0f8f7678a0adb014c6fddd51fea49" Apr 17 20:19:23.634434 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:23.634243 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:19:24.634061 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:24.634030 2576 scope.go:117] "RemoveContainer" containerID="9c28127bc1a9fbeb564ee9ed794e234079dbd288aea73b16595d8d4076c84d59" Apr 17 
20:19:24.634430 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:24.634207 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:19:27.634085 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:27.634051 2576 scope.go:117] "RemoveContainer" containerID="673f36f070d27dabeef3f86e070e3a7246c693a7cfc7ae0fd6199dc2f2a8928f" Apr 17 20:19:27.634673 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:27.634239 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" Apr 17 20:19:29.634771 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:29.634743 2576 scope.go:117] "RemoveContainer" containerID="5fe02759f1f762d3e5593bc3892ed4cd0bb44ae2606745e1260cc5eb1771f338" Apr 17 20:19:29.635212 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:29.634942 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510" Apr 17 20:19:30.731560 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:30.731529 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/5.log" Apr 17 20:19:30.843851 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:30.843820 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/5.log" Apr 17 20:19:30.959439 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:30.959403 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/storage-initializer/0.log" Apr 17 20:19:31.859510 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:31.859480 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-95444fd86-rgwr8_f944698f-a710-4c44-a3a5-8ebb500e3d86/authorino/0.log" Apr 17 20:19:34.633925 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:34.633897 2576 scope.go:117] "RemoveContainer" containerID="de18846fe6f94b8dee6791d125e90192d8e0f8f7678a0adb014c6fddd51fea49" Apr 17 20:19:34.634339 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:34.634093 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:19:35.709874 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:35.709828 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-54d89dbfcc-xwpgt_8d6c6dde-851f-4c11-84a9-81fc00ad270c/manager/0.log" Apr 17 20:19:36.052105 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:36.052082 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc_cff41bde-c7d2-44c8-ad4b-befce0847da2/manager/0.log" Apr 17 20:19:36.634217 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:36.634180 2576 scope.go:117] "RemoveContainer" containerID="9c28127bc1a9fbeb564ee9ed794e234079dbd288aea73b16595d8d4076c84d59" Apr 17 20:19:36.634435 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:36.634319 2576 scope.go:117] "RemoveContainer" containerID="f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2" Apr 17 20:19:36.634435 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:36.634393 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:19:36.634612 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:36.634526 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 20:19:37.472943 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:37.472910 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-95444fd86-rgwr8_f944698f-a710-4c44-a3a5-8ebb500e3d86/authorino/0.log" Apr 17 20:19:38.884727 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:38.884694 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6568cc58bc-nthq7_2ead0945-3986-4af6-8922-0d1eeabb3e3b/kube-auth-proxy/0.log" Apr 17 20:19:39.097234 ip-10-0-131-172 
kubenswrapper[2576]: I0417 20:19:39.097209 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5f85496cb-r87zd_69b938b6-9d16-44cc-9e04-55819bafd06e/router/0.log" Apr 17 20:19:39.436673 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:39.436641 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/storage-initializer/0.log" Apr 17 20:19:39.442834 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:39.442811 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_d219c10a-62e0-44ad-8e0d-4def7d54e510/main/5.log" Apr 17 20:19:39.555755 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:39.555724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/storage-initializer/0.log" Apr 17 20:19:39.562178 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:39.562156 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_b6316c7d-9ee3-4f7f-87b4-b97fb2117b36/main/5.log" Apr 17 20:19:39.668369 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:39.668337 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-bctd7_3cf31824-04be-4c34-9b73-76116db5f973/storage-initializer/0.log" Apr 17 20:19:39.674688 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:39.674661 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-bctd7_3cf31824-04be-4c34-9b73-76116db5f973/main/0.log" Apr 17 20:19:39.783667 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:39.783588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/main/5.log" Apr 
17 20:19:39.789435 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:39.789410 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc/storage-initializer/0.log" Apr 17 20:19:39.894118 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:39.894085 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/storage-initializer/0.log" Apr 17 20:19:39.903246 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:39.903219 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_0288f2a6-594b-4a71-bf11-4312437fc0b7/main/5.log" Apr 17 20:19:40.008012 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:40.007986 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/storage-initializer/0.log" Apr 17 20:19:40.013882 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:40.013827 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_5a40d06f-f97a-4be6-a799-e52a7a311ff1/main/5.log" Apr 17 20:19:40.634036 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:40.634001 2576 scope.go:117] "RemoveContainer" containerID="673f36f070d27dabeef3f86e070e3a7246c693a7cfc7ae0fd6199dc2f2a8928f" Apr 17 20:19:40.634223 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:40.634108 2576 scope.go:117] "RemoveContainer" containerID="5fe02759f1f762d3e5593bc3892ed4cd0bb44ae2606745e1260cc5eb1771f338" Apr 17 20:19:40.634271 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:40.634245 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main 
pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" Apr 17 20:19:40.634315 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:40.634253 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510" Apr 17 20:19:46.571067 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:46.571037 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vb6rl_b1fec543-b49e-43b9-ae94-fbf7c58061e8/global-pull-secret-syncer/0.log" Apr 17 20:19:46.669664 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:46.669632 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-g8vzm_951f5411-0db0-4850-b6dc-cdf6bfae4395/konnectivity-agent/0.log" Apr 17 20:19:46.717491 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:46.717463 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-172.ec2.internal_980e96a3ec854dc0a95f332d23ffd70e/haproxy/0.log" Apr 17 20:19:49.633912 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:49.633877 2576 scope.go:117] "RemoveContainer" containerID="de18846fe6f94b8dee6791d125e90192d8e0f8f7678a0adb014c6fddd51fea49" Apr 17 20:19:49.634351 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:49.634150 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main 
pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:19:51.388104 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:51.388075 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-95444fd86-rgwr8_f944698f-a710-4c44-a3a5-8ebb500e3d86/authorino/0.log" Apr 17 20:19:51.637457 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:51.637427 2576 scope.go:117] "RemoveContainer" containerID="9c28127bc1a9fbeb564ee9ed794e234079dbd288aea73b16595d8d4076c84d59" Apr 17 20:19:51.637641 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:51.637493 2576 scope.go:117] "RemoveContainer" containerID="f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2" Apr 17 20:19:51.637688 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:51.637642 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:19:51.637785 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:51.637764 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 20:19:53.211568 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.211510 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-fbvmc_eea67969-35bc-420e-a3e8-819b529619bf/cluster-monitoring-operator/0.log" Apr 17 20:19:53.321941 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.321914 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6f8f6f89d6-p2bnw_06270147-16c3-4db2-a6dd-58512c0c72cc/metrics-server/0.log" Apr 17 20:19:53.346156 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.346129 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-jnvf2_64493f01-c3e5-44b2-879a-259426ce4222/monitoring-plugin/0.log" Apr 17 20:19:53.448183 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.448151 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-t2lrx_63581dd1-b691-415d-8ba2-5d1f37bbf88b/node-exporter/0.log" Apr 17 20:19:53.469011 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.468979 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-t2lrx_63581dd1-b691-415d-8ba2-5d1f37bbf88b/kube-rbac-proxy/0.log" Apr 17 20:19:53.513682 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.513608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-t2lrx_63581dd1-b691-415d-8ba2-5d1f37bbf88b/init-textfile/0.log" Apr 17 20:19:53.633903 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.633870 2576 scope.go:117] "RemoveContainer" containerID="5fe02759f1f762d3e5593bc3892ed4cd0bb44ae2606745e1260cc5eb1771f338" Apr 17 20:19:53.634133 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:53.634110 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" 
pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510" Apr 17 20:19:53.737185 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.737153 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fd866433-16f1-482d-88aa-92bdff80ce12/prometheus/0.log" Apr 17 20:19:53.765899 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.765818 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fd866433-16f1-482d-88aa-92bdff80ce12/config-reloader/0.log" Apr 17 20:19:53.786147 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.786113 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fd866433-16f1-482d-88aa-92bdff80ce12/thanos-sidecar/0.log" Apr 17 20:19:53.805750 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.805725 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fd866433-16f1-482d-88aa-92bdff80ce12/kube-rbac-proxy-web/0.log" Apr 17 20:19:53.825744 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.825715 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fd866433-16f1-482d-88aa-92bdff80ce12/kube-rbac-proxy/0.log" Apr 17 20:19:53.846245 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.846220 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fd866433-16f1-482d-88aa-92bdff80ce12/kube-rbac-proxy-thanos/0.log" Apr 17 20:19:53.867396 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.867293 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fd866433-16f1-482d-88aa-92bdff80ce12/init-config-reloader/0.log" Apr 17 20:19:53.896752 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.896725 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-x8nkg_b80e8a19-d2f5-44be-995b-32f6fa5ba15c/prometheus-operator/0.log" Apr 17 20:19:53.924697 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.924675 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-x8nkg_b80e8a19-d2f5-44be-995b-32f6fa5ba15c/kube-rbac-proxy/0.log" Apr 17 20:19:53.948303 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:53.948278 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-dzdlg_723a02cc-35ab-40aa-88d0-8ef966947773/prometheus-operator-admission-webhook/0.log" Apr 17 20:19:54.634744 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:54.634716 2576 scope.go:117] "RemoveContainer" containerID="673f36f070d27dabeef3f86e070e3a7246c693a7cfc7ae0fd6199dc2f2a8928f" Apr 17 20:19:54.635139 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:19:54.634919 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" Apr 17 20:19:55.274880 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.274833 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs"] Apr 17 20:19:55.275224 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.275212 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerName="cleanup" Apr 17 20:19:55.275268 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.275226 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerName="cleanup" Apr 17 20:19:55.275268 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.275234 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerName="cleanup" Apr 17 20:19:55.275268 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.275240 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerName="cleanup" Apr 17 20:19:55.275268 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.275253 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerName="cleanup" Apr 17 20:19:55.275268 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.275259 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerName="cleanup" Apr 17 20:19:55.275436 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.275329 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerName="cleanup" Apr 17 20:19:55.275436 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.275337 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerName="cleanup" Apr 17 20:19:55.275436 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.275345 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e59351fd-81e5-415c-92ac-94da8b9a7b76" containerName="cleanup" Apr 17 20:19:55.278292 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.278275 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.280427 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.280399 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jrnl5\"/\"default-dockercfg-fxgz8\"" Apr 17 20:19:55.281179 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.281163 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrnl5\"/\"openshift-service-ca.crt\"" Apr 17 20:19:55.281250 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.281183 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrnl5\"/\"kube-root-ca.crt\"" Apr 17 20:19:55.285928 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.285907 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs"] Apr 17 20:19:55.454341 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.454303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e55d0fb-a719-4dad-80a6-759f43dae4d6-podres\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.454341 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.454343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r8b4\" (UniqueName: \"kubernetes.io/projected/3e55d0fb-a719-4dad-80a6-759f43dae4d6-kube-api-access-2r8b4\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.454565 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.454467 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e55d0fb-a719-4dad-80a6-759f43dae4d6-proc\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.454565 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.454512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e55d0fb-a719-4dad-80a6-759f43dae4d6-sys\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.454637 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.454566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e55d0fb-a719-4dad-80a6-759f43dae4d6-lib-modules\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.555503 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.555474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e55d0fb-a719-4dad-80a6-759f43dae4d6-lib-modules\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.555648 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.555523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e55d0fb-a719-4dad-80a6-759f43dae4d6-podres\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " 
pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.555648 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.555543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2r8b4\" (UniqueName: \"kubernetes.io/projected/3e55d0fb-a719-4dad-80a6-759f43dae4d6-kube-api-access-2r8b4\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.555648 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.555610 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e55d0fb-a719-4dad-80a6-759f43dae4d6-proc\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.555769 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.555649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e55d0fb-a719-4dad-80a6-759f43dae4d6-sys\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.555769 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.555682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e55d0fb-a719-4dad-80a6-759f43dae4d6-podres\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.555769 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.555682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/3e55d0fb-a719-4dad-80a6-759f43dae4d6-lib-modules\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.555769 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.555722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e55d0fb-a719-4dad-80a6-759f43dae4d6-proc\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.555769 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.555731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e55d0fb-a719-4dad-80a6-759f43dae4d6-sys\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.563457 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.563427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r8b4\" (UniqueName: \"kubernetes.io/projected/3e55d0fb-a719-4dad-80a6-759f43dae4d6-kube-api-access-2r8b4\") pod \"perf-node-gather-daemonset-kfkfs\" (UID: \"3e55d0fb-a719-4dad-80a6-759f43dae4d6\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.588329 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.588290 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:55.715057 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:55.715014 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs"] Apr 17 20:19:55.718012 ip-10-0-131-172 kubenswrapper[2576]: W0417 20:19:55.717984 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3e55d0fb_a719_4dad_80a6_759f43dae4d6.slice/crio-1d726986bc223a5c3ee97c59698d745b165f1b696029e5225fa10352155db890 WatchSource:0}: Error finding container 1d726986bc223a5c3ee97c59698d745b165f1b696029e5225fa10352155db890: Status 404 returned error can't find the container with id 1d726986bc223a5c3ee97c59698d745b165f1b696029e5225fa10352155db890 Apr 17 20:19:56.044569 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:56.044478 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" event={"ID":"3e55d0fb-a719-4dad-80a6-759f43dae4d6","Type":"ContainerStarted","Data":"557175d883ac0242145dcf5436fdbf4017091bbb5321bf5e8672505e125271d5"} Apr 17 20:19:56.044569 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:56.044521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" event={"ID":"3e55d0fb-a719-4dad-80a6-759f43dae4d6","Type":"ContainerStarted","Data":"1d726986bc223a5c3ee97c59698d745b165f1b696029e5225fa10352155db890"} Apr 17 20:19:56.044569 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:56.044551 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:19:56.060283 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:56.060232 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" 
podStartSLOduration=1.060218528 podStartE2EDuration="1.060218528s" podCreationTimestamp="2026-04-17 20:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:19:56.058248281 +0000 UTC m=+934.926068039" watchObservedRunningTime="2026-04-17 20:19:56.060218528 +0000 UTC m=+934.928038323" Apr 17 20:19:56.316962 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:56.316929 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-l7bms_a29542e8-2ce0-4458-a973-dafcdaa4d486/download-server/0.log" Apr 17 20:19:56.814683 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:56.814660 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-m55qn_cc9af87a-db40-4e36-8c0e-1c4f68713a3d/volume-data-source-validator/0.log" Apr 17 20:19:57.597420 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:57.597381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p227m_8314d958-7474-40f1-b9c0-6ce6e8e69f40/dns/0.log" Apr 17 20:19:57.620093 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:57.620063 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p227m_8314d958-7474-40f1-b9c0-6ce6e8e69f40/kube-rbac-proxy/0.log" Apr 17 20:19:57.666329 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:57.666301 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fd4vp_959e3ae9-cf8e-4e50-b31f-76cf29653018/dns-node-resolver/0.log" Apr 17 20:19:58.159627 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:58.159600 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2m9c7_eb52d0ba-f85e-4263-9b90-db812ebeca7c/node-ca/0.log" Apr 17 20:19:59.086693 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:59.086661 2576 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6568cc58bc-nthq7_2ead0945-3986-4af6-8922-0d1eeabb3e3b/kube-auth-proxy/0.log" Apr 17 20:19:59.138536 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:59.138505 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5f85496cb-r87zd_69b938b6-9d16-44cc-9e04-55819bafd06e/router/0.log" Apr 17 20:19:59.622865 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:19:59.622817 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-l2wrg_642f7cfc-05bf-4010-8948-bc21230b60e0/serve-healthcheck-canary/0.log" Apr 17 20:20:00.192825 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:00.192798 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kjsfm_0ad0b493-02ab-40b9-ac9b-42e735a5eb63/kube-rbac-proxy/0.log" Apr 17 20:20:00.213106 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:00.213076 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kjsfm_0ad0b493-02ab-40b9-ac9b-42e735a5eb63/exporter/0.log" Apr 17 20:20:00.233363 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:00.233339 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kjsfm_0ad0b493-02ab-40b9-ac9b-42e735a5eb63/extractor/0.log" Apr 17 20:20:02.058300 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:02.058269 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-kfkfs" Apr 17 20:20:02.152857 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:02.152803 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-54d89dbfcc-xwpgt_8d6c6dde-851f-4c11-84a9-81fc00ad270c/manager/0.log" Apr 17 20:20:02.251237 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:02.251207 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6bcb6fdd5f-ghzxc_cff41bde-c7d2-44c8-ad4b-befce0847da2/manager/0.log" Apr 17 20:20:03.408065 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:03.408034 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-fcb6f8ffb-q95vl_4fac432b-3436-490f-acc2-b7c3e4f955da/manager/0.log" Apr 17 20:20:03.634399 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:03.634368 2576 scope.go:117] "RemoveContainer" containerID="de18846fe6f94b8dee6791d125e90192d8e0f8f7678a0adb014c6fddd51fea49" Apr 17 20:20:03.634580 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:20:03.634557 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7_llm(1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8rzqb7" podUID="1ba24aa5-a3f4-4b2d-a0dd-6b49459dc5fc" Apr 17 20:20:04.634411 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:04.634379 2576 scope.go:117] "RemoveContainer" containerID="f002872b0b68fbaa28c46e0e9b56b24d74277a424cc21c57759a927560af3cf2" Apr 17 20:20:04.634789 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:20:04.634564 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm_llm(0288f2a6-594b-4a71-bf11-4312437fc0b7)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-n27gm" podUID="0288f2a6-594b-4a71-bf11-4312437fc0b7" Apr 17 20:20:05.634292 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:05.634258 2576 scope.go:117] "RemoveContainer" containerID="9c28127bc1a9fbeb564ee9ed794e234079dbd288aea73b16595d8d4076c84d59" Apr 17 20:20:05.634498 ip-10-0-131-172 
kubenswrapper[2576]: E0417 20:20:05.634462 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-xgd2b_llm(b6316c7d-9ee3-4f7f-87b4-b97fb2117b36)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-xgd2b" podUID="b6316c7d-9ee3-4f7f-87b4-b97fb2117b36" Apr 17 20:20:05.634824 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:05.634535 2576 scope.go:117] "RemoveContainer" containerID="673f36f070d27dabeef3f86e070e3a7246c693a7cfc7ae0fd6199dc2f2a8928f" Apr 17 20:20:05.634824 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:20:05.634715 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-w8lj4_llm(5a40d06f-f97a-4be6-a799-e52a7a311ff1)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-w8lj4" podUID="5a40d06f-f97a-4be6-a799-e52a7a311ff1" Apr 17 20:20:07.633919 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:07.633867 2576 scope.go:117] "RemoveContainer" containerID="5fe02759f1f762d3e5593bc3892ed4cd0bb44ae2606745e1260cc5eb1771f338" Apr 17 20:20:07.634374 ip-10-0-131-172 kubenswrapper[2576]: E0417 20:20:07.634134 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49_llm(d219c10a-62e0-44ad-8e0d-4def7d54e510)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-qcd49" podUID="d219c10a-62e0-44ad-8e0d-4def7d54e510" Apr 17 20:20:07.845789 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:07.845759 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dvq7l_3c3a02fe-7f40-416b-90f8-3df7975d906c/migrator/0.log" Apr 17 20:20:07.868247 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:07.868211 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dvq7l_3c3a02fe-7f40-416b-90f8-3df7975d906c/graceful-termination/0.log" Apr 17 20:20:08.217914 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:08.217866 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qrtjl_fb0f783e-6eb1-4b42-82f2-7eed827adacb/kube-storage-version-migrator-operator/1.log" Apr 17 20:20:08.218610 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:08.218592 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qrtjl_fb0f783e-6eb1-4b42-82f2-7eed827adacb/kube-storage-version-migrator-operator/0.log" Apr 17 20:20:09.426148 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:09.426120 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fwvg8_7221eba4-90c3-41d0-809f-2cfbbcbf40d8/kube-multus-additional-cni-plugins/0.log" Apr 17 20:20:09.449220 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:09.449193 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fwvg8_7221eba4-90c3-41d0-809f-2cfbbcbf40d8/egress-router-binary-copy/0.log" Apr 17 20:20:09.495951 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:09.495921 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fwvg8_7221eba4-90c3-41d0-809f-2cfbbcbf40d8/cni-plugins/0.log" Apr 17 20:20:09.570113 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:09.570080 2576 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fwvg8_7221eba4-90c3-41d0-809f-2cfbbcbf40d8/bond-cni-plugin/0.log" Apr 17 20:20:09.612730 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:09.612701 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fwvg8_7221eba4-90c3-41d0-809f-2cfbbcbf40d8/routeoverride-cni/0.log" Apr 17 20:20:09.637811 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:09.637780 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fwvg8_7221eba4-90c3-41d0-809f-2cfbbcbf40d8/whereabouts-cni-bincopy/0.log" Apr 17 20:20:09.662429 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:09.662402 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fwvg8_7221eba4-90c3-41d0-809f-2cfbbcbf40d8/whereabouts-cni/0.log" Apr 17 20:20:09.856258 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:09.856222 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k5c2g_a4c3ac1f-f0bf-4ff3-9fef-fb513199e95b/kube-multus/0.log" Apr 17 20:20:10.003815 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:10.003782 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cnnh2_7e87cc22-ed21-46b2-9dac-b1a3906efc5c/network-metrics-daemon/0.log" Apr 17 20:20:10.027125 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:10.027096 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cnnh2_7e87cc22-ed21-46b2-9dac-b1a3906efc5c/kube-rbac-proxy/0.log" Apr 17 20:20:11.412563 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:11.412529 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-controller/0.log" Apr 17 20:20:11.430408 ip-10-0-131-172 kubenswrapper[2576]: I0417 
20:20:11.430379 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/0.log" Apr 17 20:20:11.435614 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:11.435585 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovn-acl-logging/1.log" Apr 17 20:20:11.452953 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:11.452929 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/kube-rbac-proxy-node/0.log" Apr 17 20:20:11.473973 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:11.473943 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 20:20:11.492472 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:11.492451 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/northd/0.log" Apr 17 20:20:11.514947 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:11.514923 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/nbdb/0.log" Apr 17 20:20:11.537369 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:11.537276 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/sbdb/0.log" Apr 17 20:20:11.633125 ip-10-0-131-172 kubenswrapper[2576]: I0417 20:20:11.633092 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2wkz_8c7c8612-982d-4a5d-81bd-d18fab9adeef/ovnkube-controller/0.log" Apr 17 20:20:12.816659 ip-10-0-131-172 kubenswrapper[2576]: I0417 
20:20:12.816628 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zgppr_8ea69030-bfbd-46e0-bba4-3408bc57cb1c/network-check-target-container/0.log"