Apr 16 20:35:13.229048 ip-10-0-129-199 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 20:35:13.229061 ip-10-0-129-199 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 20:35:13.229071 ip-10-0-129-199 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 20:35:13.229370 ip-10-0-129-199 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 20:35:23.344098 ip-10-0-129-199 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 20:35:23.344115 ip-10-0-129-199 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 62ba96828e974831964a64d7527e22d1 --
Apr 16 20:37:40.893835 ip-10-0-129-199 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:37:41.364350 ip-10-0-129-199 kubenswrapper[2537]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:37:41.364350 ip-10-0-129-199 kubenswrapper[2537]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:37:41.364350 ip-10-0-129-199 kubenswrapper[2537]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:37:41.364350 ip-10-0-129-199 kubenswrapper[2537]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:37:41.364350 ip-10-0-129-199 kubenswrapper[2537]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:37:41.365456 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.365379 2537 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:37:41.368534 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368519 2537 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:37:41.368534 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368533 2537 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368538 2537 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368541 2537 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368544 2537 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368547 2537 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368550 2537 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368553 2537 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368565 2537 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368568 2537 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368571 2537 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368573 2537 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368576 2537 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368579 2537 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368581 2537 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368583 2537 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368587 2537 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368591 2537 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368594 2537 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368597 2537 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:37:41.368611 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368600 2537 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368603 2537 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368606 2537 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368608 2537 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368611 2537 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368624 2537 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368627 2537 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368630 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368633 2537 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368635 2537 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368638 2537 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368640 2537 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368643 2537 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368645 2537 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368648 2537 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368650 2537 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368653 2537 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368655 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368658 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:37:41.369059 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368660 2537 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368662 2537 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368665 2537 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368667 2537 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368669 2537 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368672 2537 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368675 2537 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368677 2537 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368679 2537 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368682 2537 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368684 2537 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368688 2537 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368690 2537 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368693 2537 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368695 2537 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368698 2537 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368701 2537 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368703 2537 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368706 2537 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368708 2537 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:37:41.369505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368711 2537 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368713 2537 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368715 2537 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368718 2537 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368720 2537 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368722 2537 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368725 2537 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368727 2537 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368730 2537 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368732 2537 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368734 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368737 2537 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368739 2537 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368741 2537 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368744 2537 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368747 2537 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368752 2537 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368754 2537 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368757 2537 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:37:41.369984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368760 2537 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368762 2537 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368764 2537 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368767 2537 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368769 2537 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368771 2537 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368774 2537 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.368778 2537 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369137 2537 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369142 2537 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369145 2537 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369149 2537 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369151 2537 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369154 2537 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369157 2537 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369160 2537 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369163 2537 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369165 2537 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369168 2537 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369171 2537 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:37:41.370471 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369174 2537 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369176 2537 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369179 2537 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369181 2537 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369184 2537 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369186 2537 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369189 2537 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369191 2537 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369193 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369196 2537 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369199 2537 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369202 2537 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369205 2537 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369208 2537 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369210 2537 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369213 2537 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369215 2537 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369218 2537 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369220 2537 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369223 2537 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:37:41.370943 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369226 2537 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369229 2537 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369231 2537 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369233 2537 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369236 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369238 2537 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369240 2537 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369243 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369245 2537 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369247 2537 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369251 2537 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369254 2537 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369257 2537 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369260 2537 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369262 2537 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369265 2537 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369268 2537 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369270 2537 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369273 2537 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369275 2537 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:37:41.371416 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369278 2537 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369281 2537 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369285 2537 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369287 2537 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369290 2537 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369292 2537 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369295 2537 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369297 2537 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369300 2537 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369302 2537 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369305 2537 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369307 2537 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369311 2537 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369314 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369316 2537 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369319 2537 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369322 2537 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369324 2537 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369326 2537 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:37:41.371907 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369329 2537 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369331 2537 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369333 2537 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369336 2537 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369338 2537 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369341 2537 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369343 2537 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369346 2537 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369349 2537 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369351 2537 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369354 2537 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369356 2537 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369359 2537 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369361 2537 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.369363 2537 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369442 2537 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369450 2537 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369456 2537 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369461 2537 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369465 2537 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369468 2537 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:37:41.372362 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369472 2537 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369476 2537 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369480 2537 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369483 2537 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369486 2537 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369489 2537 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369492 2537 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369495 2537 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369498 2537 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369501 2537 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369503 2537 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369506 2537 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369509 2537 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369512 2537 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369515 2537 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369518 2537 flags.go:64] FLAG: --config-dir=""
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369521 2537 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369524 2537 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369528 2537 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369531 2537 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369534 2537 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369537 2537 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369540 2537 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369543 2537 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:37:41.372868 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369546 2537 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369549 2537 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369552 2537 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369569 2537 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369575 2537 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369579 2537 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369583 2537 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369587 2537 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369589 2537 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369595 2537 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369598 2537 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369601 2537 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369604 2537 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369607 2537 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369611 2537 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369614 2537 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 20:37:41.373452 ip-10-0-129-199
kubenswrapper[2537]: I0416 20:37:41.369617 2537 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369620 2537 flags.go:64] FLAG: --eviction-soft="" Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369622 2537 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369625 2537 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369628 2537 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369631 2537 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369634 2537 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369636 2537 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369639 2537 flags.go:64] FLAG: --feature-gates="" Apr 16 20:37:41.373452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369643 2537 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369645 2537 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369649 2537 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369652 2537 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369655 2537 flags.go:64] FLAG: --healthz-port="10248" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369658 2537 flags.go:64] FLAG: 
--help="false" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369661 2537 flags.go:64] FLAG: --hostname-override="ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369664 2537 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369666 2537 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369669 2537 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369673 2537 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369676 2537 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369679 2537 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369682 2537 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369685 2537 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369687 2537 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369690 2537 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369693 2537 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369696 2537 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:37:41.374055 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:37:41.369699 2537 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369702 2537 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369705 2537 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369708 2537 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369711 2537 flags.go:64] FLAG: --lock-file="" Apr 16 20:37:41.374055 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369713 2537 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369716 2537 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369719 2537 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369724 2537 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369727 2537 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369730 2537 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369732 2537 flags.go:64] FLAG: --logging-format="text" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369735 2537 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369738 2537 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369740 2537 flags.go:64] 
FLAG: --manifest-url="" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369743 2537 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369748 2537 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369750 2537 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369755 2537 flags.go:64] FLAG: --max-pods="110" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369758 2537 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369761 2537 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369764 2537 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369766 2537 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369769 2537 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369772 2537 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369775 2537 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369782 2537 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369785 2537 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369788 2537 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 
20:37:41.374641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369791 2537 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369794 2537 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369799 2537 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369802 2537 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369805 2537 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369808 2537 flags.go:64] FLAG: --port="10250" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369811 2537 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369813 2537 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09d61f4442f345d7b" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369817 2537 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369819 2537 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369822 2537 flags.go:64] FLAG: --register-node="true" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369825 2537 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369828 2537 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369831 2537 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:37:41.375311 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:37:41.369834 2537 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369837 2537 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369839 2537 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369843 2537 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369846 2537 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369849 2537 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369851 2537 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369854 2537 flags.go:64] FLAG: --runonce="false" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369857 2537 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369859 2537 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369862 2537 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:37:41.375311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369865 2537 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369868 2537 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369870 2537 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369873 2537 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369876 2537 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369879 2537 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369882 2537 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369885 2537 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369887 2537 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369891 2537 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369894 2537 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369897 2537 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369902 2537 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369905 2537 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369907 2537 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369911 2537 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369914 2537 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369916 2537 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:37:41.375921 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:37:41.369919 2537 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369922 2537 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369925 2537 flags.go:64] FLAG: --v="2" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369929 2537 flags.go:64] FLAG: --version="false" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369933 2537 flags.go:64] FLAG: --vmodule="" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369937 2537 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.369940 2537 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:37:41.375921 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370027 2537 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370031 2537 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370034 2537 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370036 2537 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370039 2537 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370042 2537 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370045 2537 feature_gate.go:328] unrecognized feature 
gate: AutomatedEtcdBackup Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370047 2537 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370050 2537 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370053 2537 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370055 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370057 2537 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370060 2537 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370063 2537 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370066 2537 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370069 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370071 2537 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370074 2537 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370076 2537 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:37:41.376523 ip-10-0-129-199 
kubenswrapper[2537]: W0416 20:37:41.370079 2537 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:37:41.376523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370082 2537 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370084 2537 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370087 2537 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370090 2537 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370092 2537 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370095 2537 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370097 2537 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370099 2537 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370102 2537 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370104 2537 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370107 2537 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370111 2537 feature_gate.go:349] Setting deprecated feature gate 
KMSv1=true. It will be removed in a future release. Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370115 2537 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370117 2537 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370120 2537 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370122 2537 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370125 2537 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370127 2537 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370129 2537 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370132 2537 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:37:41.377018 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370134 2537 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370138 2537 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370141 2537 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370144 2537 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370147 2537 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370150 2537 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370153 2537 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370156 2537 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370158 2537 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370161 2537 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370163 2537 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370166 2537 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370168 2537 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370171 2537 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:37:41.377510 ip-10-0-129-199 
kubenswrapper[2537]: W0416 20:37:41.370173 2537 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370176 2537 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370178 2537 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370181 2537 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370183 2537 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:37:41.377510 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370186 2537 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370188 2537 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370190 2537 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370193 2537 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370195 2537 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370198 2537 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370200 2537 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370203 2537 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 
20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370205 2537 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370207 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370210 2537 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370212 2537 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370215 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370217 2537 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370220 2537 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370222 2537 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370224 2537 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370227 2537 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370229 2537 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370232 2537 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:37:41.377995 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370234 2537 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370237 2537 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370239 2537 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370242 2537 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370244 2537 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370249 2537 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.370251 2537 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.370256 2537 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.376350 2537 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.376364 2537 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376422 2537 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376427 2537 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376431 2537 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376434 2537 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376436 2537 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376439 2537 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:37:41.378505 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376442 2537 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376445 2537 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376448 2537 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376450 2537 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376454 2537 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376458 2537 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376461 2537 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376464 2537 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376467 2537 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376470 2537 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376472 2537 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376475 2537 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376478 2537 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376480 2537 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376483 2537 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376486 2537 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376490 2537 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376493 2537 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376496 2537 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:37:41.378913 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376498 2537 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376501 2537 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376503 2537 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376506 2537 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376508 2537 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376510 2537 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376514 2537 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376517 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376519 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376522 2537 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376524 2537 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376527 2537 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376530 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376532 2537 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376535 2537 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376538 2537 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376540 2537 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376543 2537 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376545 2537 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376548 2537 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:37:41.379374 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376550 2537 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376552 2537 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376555 2537 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376570 2537 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376572 2537 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376575 2537 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376577 2537 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376580 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376582 2537 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376585 2537 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376589 2537 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376592 2537 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376594 2537 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376598 2537 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376602 2537 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376604 2537 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376607 2537 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376609 2537 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376612 2537 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:37:41.379869 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376615 2537 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376618 2537 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376620 2537 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376623 2537 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376625 2537 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376628 2537 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376630 2537 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376633 2537 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376635 2537 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376638 2537 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376640 2537 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376643 2537 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376645 2537 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376648 2537 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376650 2537 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376653 2537 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376655 2537 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376658 2537 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376660 2537 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376662 2537 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:37:41.380329 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376665 2537 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376667 2537 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.376672 2537 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376759 2537 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376763 2537 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376766 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376769 2537 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376772 2537 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376775 2537 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376778 2537 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376781 2537 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376785 2537 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376787 2537 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376790 2537 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376792 2537 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:37:41.380821 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376795 2537 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376797 2537 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376800 2537 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376803 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376805 2537 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376807 2537 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376810 2537 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376813 2537 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376815 2537 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376818 2537 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376820 2537 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376823 2537 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376825 2537 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376828 2537 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376831 2537 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376833 2537 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376835 2537 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376838 2537 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376840 2537 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376843 2537 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:37:41.381183 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376846 2537 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376848 2537 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376851 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376853 2537 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376856 2537 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376858 2537 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376861 2537 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376863 2537 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376867 2537 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376869 2537 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376872 2537 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376874 2537 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376877 2537 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376879 2537 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376882 2537 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376884 2537 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376886 2537 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376889 2537 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376891 2537 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376894 2537 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:37:41.381695 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376896 2537 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376899 2537 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376901 2537 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376905 2537 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376908 2537 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376911 2537 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376914 2537 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376916 2537 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376919 2537 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376922 2537 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376924 2537 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376927 2537 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376930 2537 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376933 2537 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376935 2537 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376938 2537 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376941 2537 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376944 2537 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376947 2537 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:37:41.382162 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376950 2537 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376953 2537 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376955 2537 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376958 2537 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376960 2537 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376963 2537 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376965 2537 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376967 2537 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376970 2537 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376972 2537 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376975 2537 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376977 2537 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376979 2537 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376982 2537 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:41.376984 2537 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.376989 2537 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:37:41.382761 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.377080 2537 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 20:37:41.383141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.380792 2537 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 20:37:41.383141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.381819 2537 server.go:1019] "Starting client certificate rotation"
Apr 16 20:37:41.383141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.382272 2537 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:37:41.383141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.382301 2537 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:37:41.408929 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.408913 2537 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:37:41.412320 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.412298 2537 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:37:41.429964 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.429944 2537 log.go:25] "Validated CRI v1 runtime API"
Apr 16 20:37:41.435798 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.435779 2537 log.go:25] "Validated CRI v1 image API"
Apr 16 20:37:41.437018 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.437003 2537 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 20:37:41.441404 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.441389 2537 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:37:41.442623 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.442606 2537 fs.go:135] Filesystem UUIDs: map[02151dee-a397-4186-bdeb-60c64b5231d6:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e6b0420f-75de-4aed-abba-f878f855eae1:/dev/nvme0n1p3]
Apr 16 20:37:41.442662 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.442624 2537 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 20:37:41.448876 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.448776 2537 manager.go:217] Machine: {Timestamp:2026-04-16 20:37:41.446655411 +0000 UTC m=+0.431798998 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200740 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ee292d24cb82d4f3ec307a3fa8fbf SystemUUID:ec2ee292-d24c-b82d-4f3e-c307a3fa8fbf BootID:62ba9682-8e97-4831-964a-64d7527e22d1 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:16:b3:db:1e:55 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:16:b3:db:1e:55 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8e:bb:bf:e8:ef:bc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 20:37:41.448876 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.448871 2537 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 20:37:41.448983 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.448960 2537 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 20:37:41.450180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.450157 2537 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 20:37:41.450308 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.450183 2537 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-199.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 20:37:41.450354 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.450316 2537 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 20:37:41.450354 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.450324 2537 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 20:37:41.450354 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.450337
2537 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:37:41.451439 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.451429 2537 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:37:41.452394 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.452383 2537 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:37:41.452502 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.452493 2537 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 20:37:41.455131 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.455121 2537 kubelet.go:491] "Attempting to sync node with API server" Apr 16 20:37:41.455170 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.455138 2537 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 20:37:41.455170 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.455149 2537 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 20:37:41.455170 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.455157 2537 kubelet.go:397] "Adding apiserver pod source" Apr 16 20:37:41.455170 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.455165 2537 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 20:37:41.456267 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.456254 2537 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:37:41.456315 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.456272 2537 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:37:41.460352 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.460333 2537 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 20:37:41.462110 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:37:41.462094 2537 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 20:37:41.463524 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.463512 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 20:37:41.463613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.463527 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 20:37:41.463613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.463533 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 20:37:41.463613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.463538 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 20:37:41.463613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.463543 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 20:37:41.463613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.463549 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 20:37:41.463613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.463554 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 20:37:41.463613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.463577 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 20:37:41.463613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.463588 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 20:37:41.463613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.463596 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 20:37:41.463613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.463604 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
20:37:41.463613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.463612 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 20:37:41.464623 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.464612 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 20:37:41.464623 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.464622 2537 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 20:37:41.467878 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.467864 2537 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 20:37:41.467961 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.467897 2537 server.go:1295] "Started kubelet" Apr 16 20:37:41.468015 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.467968 2537 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 20:37:41.468062 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.467981 2537 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 20:37:41.468062 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.468042 2537 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 20:37:41.468676 ip-10-0-129-199 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 20:37:41.469831 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.469814 2537 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 20:37:41.469894 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.469859 2537 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-199.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 20:37:41.470161 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.470147 2537 server.go:317] "Adding debug handlers to kubelet server" Apr 16 20:37:41.470468 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.470449 2537 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 20:37:41.470507 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.470450 2537 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-199.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 20:37:41.475287 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.475271 2537 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 20:37:41.476232 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.475891 2537 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 20:37:41.476814 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.476663 2537 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 20:37:41.476814 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:37:41.476771 2537 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 20:37:41.476944 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.476857 2537 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 20:37:41.476944 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.475815 2537 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-199.ec2.internal.18a6f0cbdfcac8a5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-199.ec2.internal,UID:ip-10-0-129-199.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-199.ec2.internal,},FirstTimestamp:2026-04-16 20:37:41.467875493 +0000 UTC m=+0.453019078,LastTimestamp:2026-04-16 20:37:41.467875493 +0000 UTC m=+0.453019078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-199.ec2.internal,}" Apr 16 20:37:41.476944 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.476903 2537 reconstruct.go:97] "Volume reconstruction finished" Apr 16 20:37:41.476944 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.476909 2537 reconciler.go:26] "Reconciler: start to sync state" Apr 16 20:37:41.477220 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.477194 2537 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:41.477220 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.477203 2537 factory.go:55] Registering systemd factory Apr 16 20:37:41.477360 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.477235 2537 factory.go:223] 
Registration of the systemd container factory successfully Apr 16 20:37:41.477360 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.477288 2537 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 20:37:41.477732 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.477705 2537 factory.go:153] Registering CRI-O factory Apr 16 20:37:41.477732 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.477725 2537 factory.go:223] Registration of the crio container factory successfully Apr 16 20:37:41.477931 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.477811 2537 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 20:37:41.477931 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.477836 2537 factory.go:103] Registering Raw factory Apr 16 20:37:41.477931 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.477851 2537 manager.go:1196] Started watching for new ooms in manager Apr 16 20:37:41.478268 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.478256 2537 manager.go:319] Starting recovery of all containers Apr 16 20:37:41.483017 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.482988 2537 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 20:37:41.483308 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.483281 2537 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-199.ec2.internal\" is forbidden: User 
\"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 20:37:41.484577 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.484539 2537 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8h85p" Apr 16 20:37:41.488533 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.488480 2537 manager.go:324] Recovery completed Apr 16 20:37:41.489881 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.489850 2537 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 20:37:41.492620 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.492608 2537 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:37:41.493594 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.493578 2537 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8h85p" Apr 16 20:37:41.494942 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.494928 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:37:41.495002 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.494952 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:37:41.495002 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.494963 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:37:41.495379 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.495367 2537 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 
20:37:41.495379 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.495378 2537 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 20:37:41.495460 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.495391 2537 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:37:41.497239 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.497185 2537 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-199.ec2.internal.18a6f0cbe167c5ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-199.ec2.internal,UID:ip-10-0-129-199.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-199.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-199.ec2.internal,},FirstTimestamp:2026-04-16 20:37:41.494941167 +0000 UTC m=+0.480084748,LastTimestamp:2026-04-16 20:37:41.494941167 +0000 UTC m=+0.480084748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-199.ec2.internal,}" Apr 16 20:37:41.497962 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.497950 2537 policy_none.go:49] "None policy: Start" Apr 16 20:37:41.498010 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.497966 2537 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 20:37:41.498010 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.497975 2537 state_mem.go:35] "Initializing new in-memory state store" Apr 16 20:37:41.536140 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.536125 2537 manager.go:341] "Starting Device Plugin manager" Apr 16 20:37:41.550794 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.536154 2537 
manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 20:37:41.550794 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.536165 2537 server.go:85] "Starting device plugin registration server" Apr 16 20:37:41.550794 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.536360 2537 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 20:37:41.550794 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.536373 2537 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 20:37:41.550794 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.536473 2537 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 20:37:41.550794 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.536542 2537 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 20:37:41.550794 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.536552 2537 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 20:37:41.550794 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.537042 2537 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 20:37:41.550794 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.537078 2537 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:41.581708 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.581689 2537 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 20:37:41.582799 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.582781 2537 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 20:37:41.582870 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.582805 2537 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 20:37:41.582870 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.582819 2537 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 20:37:41.582870 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.582825 2537 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 20:37:41.582870 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.582856 2537 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 20:37:41.586815 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.586799 2537 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:37:41.637487 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.637446 2537 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:37:41.638126 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.638103 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:37:41.638191 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.638130 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:37:41.638191 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.638139 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:37:41.638191 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.638158 2537 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.644676 ip-10-0-129-199 kubenswrapper[2537]: I0416 
20:37:41.644661 2537 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.644762 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.644683 2537 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-199.ec2.internal\": node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:41.655082 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.655060 2537 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:41.682969 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.682945 2537 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal"] Apr 16 20:37:41.683026 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.683011 2537 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:37:41.683650 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.683632 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:37:41.683718 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.683656 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:37:41.683718 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.683666 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:37:41.685087 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.685076 2537 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:37:41.685263 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.685248 2537 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.685296 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.685283 2537 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:37:41.685761 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.685743 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:37:41.685840 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.685772 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:37:41.685840 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.685787 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:37:41.685840 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.685750 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:37:41.685932 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.685852 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:37:41.685932 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.685866 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:37:41.687108 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.687096 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.687157 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.687117 2537 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:37:41.687728 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.687715 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:37:41.687831 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.687740 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:37:41.687831 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.687756 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:37:41.705148 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.705127 2537 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-199.ec2.internal\" not found" node="ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.708850 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.708837 2537 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-199.ec2.internal\" not found" node="ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.756075 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.756058 2537 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:41.778933 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.778914 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/416f09eead2ff50b4bdd823b81a2a56e-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal\" (UID: \"416f09eead2ff50b4bdd823b81a2a56e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.779008 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.778938 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/416f09eead2ff50b4bdd823b81a2a56e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal\" (UID: \"416f09eead2ff50b4bdd823b81a2a56e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.779008 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.778955 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/87d4ca475b53fa90f2c794fc65d796bc-config\") pod \"kube-apiserver-proxy-ip-10-0-129-199.ec2.internal\" (UID: \"87d4ca475b53fa90f2c794fc65d796bc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.856407 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.856377 2537 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:41.879846 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.879825 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/87d4ca475b53fa90f2c794fc65d796bc-config\") pod \"kube-apiserver-proxy-ip-10-0-129-199.ec2.internal\" (UID: \"87d4ca475b53fa90f2c794fc65d796bc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.879914 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.879855 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/87d4ca475b53fa90f2c794fc65d796bc-config\") pod \"kube-apiserver-proxy-ip-10-0-129-199.ec2.internal\" (UID: \"87d4ca475b53fa90f2c794fc65d796bc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.879914 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.879900 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/416f09eead2ff50b4bdd823b81a2a56e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal\" (UID: \"416f09eead2ff50b4bdd823b81a2a56e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.879977 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.879919 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/416f09eead2ff50b4bdd823b81a2a56e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal\" (UID: \"416f09eead2ff50b4bdd823b81a2a56e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.879977 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.879941 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/416f09eead2ff50b4bdd823b81a2a56e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal\" (UID: \"416f09eead2ff50b4bdd823b81a2a56e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.880036 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:41.879963 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/416f09eead2ff50b4bdd823b81a2a56e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal\" (UID: \"416f09eead2ff50b4bdd823b81a2a56e\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 16 20:37:41.957125 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:41.957052 2537 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:42.007624 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.007596 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 16 20:37:42.011326 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.011310 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" Apr 16 20:37:42.057590 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:42.057539 2537 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:42.158043 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:42.158009 2537 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:42.258532 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:42.258468 2537 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:42.293982 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.293957 2537 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:37:42.358888 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:42.358862 2537 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:42.381183 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.381160 2537 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Apr 16 20:37:42.381825 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.381292 2537 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 20:37:42.381825 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.381316 2537 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 20:37:42.459815 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:42.459789 2537 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:42.475797 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.475772 2537 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 20:37:42.492814 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.492786 2537 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:37:42.495581 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.495540 2537 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:32:41 +0000 UTC" deadline="2027-09-27 23:33:56.767674851 +0000 UTC" Apr 16 20:37:42.495648 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.495581 2537 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12698h56m14.272096988s" Apr 16 20:37:42.517027 
ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.516983 2537 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nq9hf" Apr 16 20:37:42.522899 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.522883 2537 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nq9hf" Apr 16 20:37:42.560197 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:42.560178 2537 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 16 20:37:42.579767 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.579697 2537 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:37:42.611090 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.611072 2537 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:37:42.656519 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:42.656496 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87d4ca475b53fa90f2c794fc65d796bc.slice/crio-582a1e44b5d6398407c1c2ba7040cb1fe8be4c922b70790e3f2acc720a6deb80 WatchSource:0}: Error finding container 582a1e44b5d6398407c1c2ba7040cb1fe8be4c922b70790e3f2acc720a6deb80: Status 404 returned error can't find the container with id 582a1e44b5d6398407c1c2ba7040cb1fe8be4c922b70790e3f2acc720a6deb80 Apr 16 20:37:42.656984 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:42.656966 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod416f09eead2ff50b4bdd823b81a2a56e.slice/crio-dad380d20b4df771e458beeba6e5f866200b2a859877548d8988c47922296528 WatchSource:0}: Error finding container dad380d20b4df771e458beeba6e5f866200b2a859877548d8988c47922296528: Status 404 returned 
error can't find the container with id dad380d20b4df771e458beeba6e5f866200b2a859877548d8988c47922296528 Apr 16 20:37:42.662122 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.662099 2537 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:37:42.676976 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.676954 2537 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 16 20:37:42.692095 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.692078 2537 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:37:42.693194 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.693182 2537 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" Apr 16 20:37:42.701035 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:42.701023 2537 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:37:43.456103 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.456074 2537 apiserver.go:52] "Watching apiserver" Apr 16 20:37:43.464610 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.464414 2537 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 20:37:43.467289 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.467259 2537 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/konnectivity-agent-nlrkh","openshift-cluster-node-tuning-operator/tuned-kftrw","openshift-dns/node-resolver-gsrd4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal","openshift-multus/network-metrics-daemon-d6xkn","openshift-network-operator/iptables-alerter-fz4qv","kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q","openshift-image-registry/node-ca-2t2sk","openshift-multus/multus-8bj89","openshift-multus/multus-additional-cni-plugins-kslqr","openshift-network-diagnostics/network-check-target-4cr2r","openshift-ovn-kubernetes/ovnkube-node-5qqqk"] Apr 16 20:37:43.469397 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.469373 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:43.469528 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:43.469456 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6" Apr 16 20:37:43.472315 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.472061 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.473306 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.473285 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gsrd4" Apr 16 20:37:43.474302 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.474280 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 20:37:43.474302 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.474292 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:37:43.474456 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.474439 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-j7x4g\"" Apr 16 20:37:43.474628 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.474611 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:43.474712 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:43.474672 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:37:43.475464 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.475300 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nn288\"" Apr 16 20:37:43.475650 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.475493 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 20:37:43.475650 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.475581 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 20:37:43.475919 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.475860 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nlrkh" Apr 16 20:37:43.477236 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.477217 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fz4qv" Apr 16 20:37:43.478083 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.478058 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7wz76\"" Apr 16 20:37:43.478401 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.478265 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 20:37:43.478401 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.478268 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 20:37:43.479138 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.478879 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.479138 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.479119 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 20:37:43.479424 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.479346 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:37:43.479424 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.479421 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.479573 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.479441 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 20:37:43.479632 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.479542 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kqz7s\"" Apr 16 20:37:43.480376 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.480342 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2t2sk" Apr 16 20:37:43.481242 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.481221 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 20:37:43.481549 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.481530 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 20:37:43.481646 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.481535 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 20:37:43.481863 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.481818 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.482592 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.482574 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 20:37:43.483134 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.483116 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 20:37:43.483286 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.483269 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.484117 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.484095 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 20:37:43.484246 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.484204 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 20:37:43.484444 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.484428 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-bdfcd\"" Apr 16 20:37:43.484670 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.484652 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 20:37:43.484929 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.484908 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-72qsr\"" Apr 16 20:37:43.485002 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.484979 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 20:37:43.485198 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.485180 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 20:37:43.485329 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.485311 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 20:37:43.485619 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.485598 2537 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 20:37:43.486004 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.485985 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-48c8z\"" Apr 16 20:37:43.486276 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.486259 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-d2jlp\"" Apr 16 20:37:43.486425 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.486408 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 20:37:43.486487 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.486442 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 20:37:43.486596 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.486549 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 20:37:43.486786 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.486771 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5jhxj\"" Apr 16 20:37:43.486946 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.486922 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 20:37:43.487174 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.487152 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 20:37:43.487988 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.487964 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 20:37:43.488142 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.488124 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-env-overrides\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.488213 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.488170 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-ovn-node-metrics-cert\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.488285 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.488210 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/43c2ff47-b11f-4e08-99d4-c54547429f56-iptables-alerter-script\") pod \"iptables-alerter-fz4qv\" (UID: \"43c2ff47-b11f-4e08-99d4-c54547429f56\") " pod="openshift-network-operator/iptables-alerter-fz4qv" Apr 16 20:37:43.488285 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.488249 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-var-lib-openvswitch\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.488373 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.488313 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-etc-openvswitch\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.488373 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.488341 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-run-openvswitch\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.488459 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.488370 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-cni-netd\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.488459 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.488400 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-ovnkube-script-lib\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.488595 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.488550 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvmvf\" (UniqueName: \"kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf\") pod \"network-check-target-4cr2r\" (UID: \"e8800de1-77ef-480f-81de-cc93318d33b6\") " pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:43.489921 
ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489489 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/36f6c978-ccae-4818-a370-35e0101bf84f-agent-certs\") pod \"konnectivity-agent-nlrkh\" (UID: \"36f6c978-ccae-4818-a370-35e0101bf84f\") " pod="kube-system/konnectivity-agent-nlrkh" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489531 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-systemd-units\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489571 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-lib-modules\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489597 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/74821fb7-65d5-4396-bc93-add3e5936d13-hosts-file\") pod \"node-resolver-gsrd4\" (UID: \"74821fb7-65d5-4396-bc93-add3e5936d13\") " pod="openshift-dns/node-resolver-gsrd4" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489628 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43c2ff47-b11f-4e08-99d4-c54547429f56-host-slash\") pod \"iptables-alerter-fz4qv\" (UID: \"43c2ff47-b11f-4e08-99d4-c54547429f56\") " 
pod="openshift-network-operator/iptables-alerter-fz4qv" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489643 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45qt6\" (UniqueName: \"kubernetes.io/projected/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-kube-api-access-45qt6\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489662 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cm77\" (UniqueName: \"kubernetes.io/projected/43c2ff47-b11f-4e08-99d4-c54547429f56-kube-api-access-7cm77\") pod \"iptables-alerter-fz4qv\" (UID: \"43c2ff47-b11f-4e08-99d4-c54547429f56\") " pod="openshift-network-operator/iptables-alerter-fz4qv" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489683 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-var-lib-kubelet\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489703 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-modprobe-d\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489724 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-sysctl-d\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489744 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-run\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489767 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpzql\" (UniqueName: \"kubernetes.io/projected/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-kube-api-access-kpzql\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489789 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-slash\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489820 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-host\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489856 2537 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-tuned\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489878 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-tmp\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.489921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489901 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74821fb7-65d5-4396-bc93-add3e5936d13-tmp-dir\") pod \"node-resolver-gsrd4\" (UID: \"74821fb7-65d5-4396-bc93-add3e5936d13\") " pod="openshift-dns/node-resolver-gsrd4"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489930 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489954 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-kubelet\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489975 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-run-systemd\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.489999 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-sysconfig\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490020 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/36f6c978-ccae-4818-a370-35e0101bf84f-konnectivity-ca\") pod \"konnectivity-agent-nlrkh\" (UID: \"36f6c978-ccae-4818-a370-35e0101bf84f\") " pod="kube-system/konnectivity-agent-nlrkh"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490041 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-run-ovn\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490062 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-log-socket\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490085 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-run-ovn-kubernetes\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490106 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-sysctl-conf\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490137 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-systemd\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490169 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-sys\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490191 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-579l5\" (UniqueName: \"kubernetes.io/projected/74821fb7-65d5-4396-bc93-add3e5936d13-kube-api-access-579l5\") pod \"node-resolver-gsrd4\" (UID: \"74821fb7-65d5-4396-bc93-add3e5936d13\") " pod="openshift-dns/node-resolver-gsrd4"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490236 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-run-netns\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490315 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-cni-bin\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490341 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.490759 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490366 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-ovnkube-config\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.491524 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490411 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-kubernetes\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.491524 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490436 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jc4d\" (UniqueName: \"kubernetes.io/projected/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-kube-api-access-7jc4d\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.491524 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.490465 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-node-log\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.525237 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.524893 2537 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:32:42 +0000 UTC" deadline="2028-01-18 03:42:30.038217034 +0000 UTC"
Apr 16 20:37:43.525237 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.524917 2537 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15391h4m46.513303302s"
Apr 16 20:37:43.577751 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.577730 2537 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 20:37:43.586615 ip-10-0-129-199 kubenswrapper[2537]: I0416
20:37:43.586568 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" event={"ID":"87d4ca475b53fa90f2c794fc65d796bc","Type":"ContainerStarted","Data":"582a1e44b5d6398407c1c2ba7040cb1fe8be4c922b70790e3f2acc720a6deb80"}
Apr 16 20:37:43.587511 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.587489 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" event={"ID":"416f09eead2ff50b4bdd823b81a2a56e","Type":"ContainerStarted","Data":"dad380d20b4df771e458beeba6e5f866200b2a859877548d8988c47922296528"}
Apr 16 20:37:43.590767 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.590736 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-node-log\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.590863 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.590775 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-ovn-node-metrics-cert\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.590863 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.590805 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-device-dir\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q"
Apr 16 20:37:43.590863 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.590829 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-system-cni-dir\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89"
Apr 16 20:37:43.590863 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.590842 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-node-log\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.590863 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.590854 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-etc-kubernetes\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89"
Apr 16 20:37:43.591083 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.590879 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnbw9\" (UniqueName: \"kubernetes.io/projected/7770e551-6840-450b-81fa-c37714dbe265-kube-api-access-vnbw9\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89"
Apr 16 20:37:43.591083 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.590905 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-etc-openvswitch\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.591083 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.590929 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-run-openvswitch\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.591083 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.590955 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr"
Apr 16 20:37:43.591083 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.590992 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-etc-openvswitch\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.591083 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591038 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-run-openvswitch\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.591083 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591072 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr"
Apr 16 20:37:43.591390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591144 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb6mc\" (UniqueName: \"kubernetes.io/projected/81e7762a-1005-4a21-8f55-9dda467004e0-kube-api-access-xb6mc\") pod \"node-ca-2t2sk\" (UID: \"81e7762a-1005-4a21-8f55-9dda467004e0\") " pod="openshift-image-registry/node-ca-2t2sk"
Apr 16 20:37:43.591390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591161 2537 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 20:37:43.591390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591173 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-multus-cni-dir\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89"
Apr 16 20:37:43.591390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591197 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-multus-socket-dir-parent\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89"
Apr 16 20:37:43.591390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591220 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-multus-conf-dir\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89"
Apr 16 20:37:43.591390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591249 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmvf\" (UniqueName: \"kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf\") pod \"network-check-target-4cr2r\" (UID: \"e8800de1-77ef-480f-81de-cc93318d33b6\") " pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:37:43.591390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591274 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q"
Apr 16 20:37:43.591390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591297 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-lib-modules\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.591390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591318 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43c2ff47-b11f-4e08-99d4-c54547429f56-host-slash\") pod \"iptables-alerter-fz4qv\" (UID: \"43c2ff47-b11f-4e08-99d4-c54547429f56\") " pod="openshift-network-operator/iptables-alerter-fz4qv"
Apr 16 20:37:43.591390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591344 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45qt6\" (UniqueName:
\"kubernetes.io/projected/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-kube-api-access-45qt6\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.591390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591367 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-registration-dir\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q"
Apr 16 20:37:43.591390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591391 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-run-netns\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591416 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-var-lib-kubelet\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591441 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-sysctl-d\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591461 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-run\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591484 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-slash\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591522 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/81e7762a-1005-4a21-8f55-9dda467004e0-serviceca\") pod \"node-ca-2t2sk\" (UID: \"81e7762a-1005-4a21-8f55-9dda467004e0\") " pod="openshift-image-registry/node-ca-2t2sk"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591545 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-host\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591618 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-host\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591618 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-var-lib-kubelet\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591656 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43c2ff47-b11f-4e08-99d4-c54547429f56-host-slash\") pod \"iptables-alerter-fz4qv\" (UID: \"43c2ff47-b11f-4e08-99d4-c54547429f56\") " pod="openshift-network-operator/iptables-alerter-fz4qv"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591671 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-run\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591696 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-tmp\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591706 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-slash\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591723 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-lib-modules\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591725 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591763 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-run-systemd\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591837 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-os-release\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr"
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:43.591863 2537 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:37:43.591937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591873 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-hostroot\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89"
Apr 16 20:37:43.592717 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591896 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-run-multus-certs\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89"
Apr 16 20:37:43.592717 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591895 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-run-systemd\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.592717 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591927 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8chxz\" (UniqueName: \"kubernetes.io/projected/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-kube-api-access-8chxz\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr"
Apr 16 20:37:43.592717 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:43.591957 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs podName:11dcf076-13e5-4128-bf13-7e6c86c6dd5b nodeName:}" failed. No retries permitted until 2026-04-16 20:37:44.091933361 +0000 UTC m=+3.077076930 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs") pod "network-metrics-daemon-d6xkn" (UID: "11dcf076-13e5-4128-bf13-7e6c86c6dd5b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:37:43.592717 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.591997 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/36f6c978-ccae-4818-a370-35e0101bf84f-konnectivity-ca\") pod \"konnectivity-agent-nlrkh\" (UID: \"36f6c978-ccae-4818-a370-35e0101bf84f\") " pod="kube-system/konnectivity-agent-nlrkh"
Apr 16 20:37:43.592717 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.592034 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-run-ovn\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.592717 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.592074 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-sysctl-d\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.592717 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.592123 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-log-socket\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.592717 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.592145 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-run-ovn\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.592717 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.592200 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-system-cni-dir\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr"
Apr 16 20:37:43.592717 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.592226 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-log-socket\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.593721 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.593668 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-socket-dir\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q"
Apr 16 20:37:43.593791 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.593738 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81e7762a-1005-4a21-8f55-9dda467004e0-host\") pod \"node-ca-2t2sk\" (UID: \"81e7762a-1005-4a21-8f55-9dda467004e0\") " pod="openshift-image-registry/node-ca-2t2sk"
Apr 16 20:37:43.593791 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.593668 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/36f6c978-ccae-4818-a370-35e0101bf84f-konnectivity-ca\") pod \"konnectivity-agent-nlrkh\" (UID: \"36f6c978-ccae-4818-a370-35e0101bf84f\") " pod="kube-system/konnectivity-agent-nlrkh"
Apr 16 20:37:43.593881 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.593836 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-sysctl-conf\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.593881 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.593861 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-sys\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.593972 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.593891 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-run-netns\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.593972 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.593923 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.593972 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.593958 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7770e551-6840-450b-81fa-c37714dbe265-multus-daemon-config\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89"
Apr 16 20:37:43.594110 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.593993 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-kubernetes\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.594110 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594020 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jc4d\" (UniqueName: \"kubernetes.io/projected/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-kube-api-access-7jc4d\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.594110 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594055 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr"
Apr 16 20:37:43.594110 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594086 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-sys-fs\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q"
Apr 16 20:37:43.594323 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594127 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-var-lib-cni-bin\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89"
Apr 16 20:37:43.594323 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594110 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.594323 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594175 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/43c2ff47-b11f-4e08-99d4-c54547429f56-iptables-alerter-script\") pod \"iptables-alerter-fz4qv\" (UID: \"43c2ff47-b11f-4e08-99d4-c54547429f56\") " pod="openshift-network-operator/iptables-alerter-fz4qv"
Apr 16 20:37:43.594323 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594214 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-var-lib-openvswitch\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.594323 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594227 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-run-netns\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.594323 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594248 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-cni-netd\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.594323 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594284 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-ovnkube-script-lib\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:37:43.594323 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594266 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-sys\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw"
Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594326 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-cnibin\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89"
Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594389 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-var-lib-cni-multus\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594428 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/36f6c978-ccae-4818-a370-35e0101bf84f-agent-certs\") pod \"konnectivity-agent-nlrkh\" (UID: \"36f6c978-ccae-4818-a370-35e0101bf84f\") " pod="kube-system/konnectivity-agent-nlrkh" Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594450 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-systemd-units\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594472 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/74821fb7-65d5-4396-bc93-add3e5936d13-hosts-file\") pod \"node-resolver-gsrd4\" (UID: \"74821fb7-65d5-4396-bc93-add3e5936d13\") " pod="openshift-dns/node-resolver-gsrd4" Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594496 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7770e551-6840-450b-81fa-c37714dbe265-cni-binary-copy\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594522 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7cm77\" (UniqueName: \"kubernetes.io/projected/43c2ff47-b11f-4e08-99d4-c54547429f56-kube-api-access-7cm77\") pod \"iptables-alerter-fz4qv\" (UID: \"43c2ff47-b11f-4e08-99d4-c54547429f56\") " pod="openshift-network-operator/iptables-alerter-fz4qv" Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594546 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-modprobe-d\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594586 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpzql\" (UniqueName: \"kubernetes.io/projected/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-kube-api-access-kpzql\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594551 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-sysctl-conf\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594611 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-tuned\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594636 2537 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74821fb7-65d5-4396-bc93-add3e5936d13-tmp-dir\") pod \"node-resolver-gsrd4\" (UID: \"74821fb7-65d5-4396-bc93-add3e5936d13\") " pod="openshift-dns/node-resolver-gsrd4" Apr 16 20:37:43.594674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594662 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-kubelet\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.595180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594686 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-etc-selinux\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.595180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594709 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-sysconfig\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.595180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594727 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-run-ovn-kubernetes\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.595180 ip-10-0-129-199 kubenswrapper[2537]: I0416 
20:37:43.594781 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-cnibin\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.595180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594813 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kck8\" (UniqueName: \"kubernetes.io/projected/366e6f1a-199e-4381-a962-80622ebd99b0-kube-api-access-2kck8\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.595180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594833 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-var-lib-kubelet\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.595180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.594896 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-systemd\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.595180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595033 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-579l5\" (UniqueName: \"kubernetes.io/projected/74821fb7-65d5-4396-bc93-add3e5936d13-kube-api-access-579l5\") pod \"node-resolver-gsrd4\" (UID: 
\"74821fb7-65d5-4396-bc93-add3e5936d13\") " pod="openshift-dns/node-resolver-gsrd4" Apr 16 20:37:43.595180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595056 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-cni-bin\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.595180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595112 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-cni-bin\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.595180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595129 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-kubelet\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.595180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595183 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-sysconfig\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.595677 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595184 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-ovnkube-config\") pod \"ovnkube-node-5qqqk\" (UID: 
\"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.595677 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595221 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-run-ovn-kubernetes\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.595677 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595232 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-env-overrides\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.595677 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595270 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-cni-binary-copy\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.595677 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595310 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-os-release\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.595677 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595336 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-run-k8s-cni-cncf-io\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.595677 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595494 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-modprobe-d\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.595677 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595616 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-systemd-units\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.595992 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595684 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-systemd\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.595992 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595747 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-env-overrides\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.595992 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.595770 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/74821fb7-65d5-4396-bc93-add3e5936d13-hosts-file\") pod \"node-resolver-gsrd4\" (UID: \"74821fb7-65d5-4396-bc93-add3e5936d13\") " pod="openshift-dns/node-resolver-gsrd4" Apr 16 20:37:43.596229 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.596205 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-kubernetes\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.596502 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.596476 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-host-cni-netd\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.596587 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.596502 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74821fb7-65d5-4396-bc93-add3e5936d13-tmp-dir\") pod \"node-resolver-gsrd4\" (UID: \"74821fb7-65d5-4396-bc93-add3e5936d13\") " pod="openshift-dns/node-resolver-gsrd4" Apr 16 20:37:43.596641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.596579 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-var-lib-openvswitch\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.596921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.596901 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-ovnkube-config\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.597102 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.597083 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/43c2ff47-b11f-4e08-99d4-c54547429f56-iptables-alerter-script\") pod \"iptables-alerter-fz4qv\" (UID: \"43c2ff47-b11f-4e08-99d4-c54547429f56\") " pod="openshift-network-operator/iptables-alerter-fz4qv" Apr 16 20:37:43.597244 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.597225 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-ovn-node-metrics-cert\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.597303 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.597272 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-ovnkube-script-lib\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.597782 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.597688 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/36f6c978-ccae-4818-a370-35e0101bf84f-agent-certs\") pod \"konnectivity-agent-nlrkh\" (UID: \"36f6c978-ccae-4818-a370-35e0101bf84f\") " pod="kube-system/konnectivity-agent-nlrkh" Apr 16 20:37:43.597871 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:43.597861 2537 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:37:43.597917 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:43.597879 2537 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:37:43.597917 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:43.597898 2537 projected.go:194] Error preparing data for projected volume kube-api-access-nvmvf for pod openshift-network-diagnostics/network-check-target-4cr2r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:37:43.598007 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:43.597994 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf podName:e8800de1-77ef-480f-81de-cc93318d33b6 nodeName:}" failed. No retries permitted until 2026-04-16 20:37:44.097977494 +0000 UTC m=+3.083121067 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nvmvf" (UniqueName: "kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf") pod "network-check-target-4cr2r" (UID: "e8800de1-77ef-480f-81de-cc93318d33b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:37:43.599012 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.598537 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-etc-tuned\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.599519 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.599499 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-tmp\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.600116 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.600094 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45qt6\" (UniqueName: \"kubernetes.io/projected/4f9a3fdc-c6ac-415d-a704-3724ed4158a1-kube-api-access-45qt6\") pod \"ovnkube-node-5qqqk\" (UID: \"4f9a3fdc-c6ac-415d-a704-3724ed4158a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.602889 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.602845 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-579l5\" (UniqueName: \"kubernetes.io/projected/74821fb7-65d5-4396-bc93-add3e5936d13-kube-api-access-579l5\") pod \"node-resolver-gsrd4\" (UID: \"74821fb7-65d5-4396-bc93-add3e5936d13\") " pod="openshift-dns/node-resolver-gsrd4" Apr 16 20:37:43.603317 
ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.603297 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cm77\" (UniqueName: \"kubernetes.io/projected/43c2ff47-b11f-4e08-99d4-c54547429f56-kube-api-access-7cm77\") pod \"iptables-alerter-fz4qv\" (UID: \"43c2ff47-b11f-4e08-99d4-c54547429f56\") " pod="openshift-network-operator/iptables-alerter-fz4qv" Apr 16 20:37:43.604438 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.604414 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpzql\" (UniqueName: \"kubernetes.io/projected/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-kube-api-access-kpzql\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:43.604605 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.604587 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jc4d\" (UniqueName: \"kubernetes.io/projected/e04adda1-dbd0-4e8a-b43f-08fc4771a0d3-kube-api-access-7jc4d\") pod \"tuned-kftrw\" (UID: \"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3\") " pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.696091 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696060 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-device-dir\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.696253 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696100 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-system-cni-dir\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " 
pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696253 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696124 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-etc-kubernetes\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696253 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696144 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnbw9\" (UniqueName: \"kubernetes.io/projected/7770e551-6840-450b-81fa-c37714dbe265-kube-api-access-vnbw9\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696253 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696168 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.696253 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696175 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-device-dir\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.696253 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696191 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.696253 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696197 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-system-cni-dir\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696253 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696238 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-etc-kubernetes\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696241 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xb6mc\" (UniqueName: \"kubernetes.io/projected/81e7762a-1005-4a21-8f55-9dda467004e0-kube-api-access-xb6mc\") pod \"node-ca-2t2sk\" (UID: \"81e7762a-1005-4a21-8f55-9dda467004e0\") " pod="openshift-image-registry/node-ca-2t2sk" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696334 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-multus-cni-dir\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696362 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-multus-socket-dir-parent\") pod 
\"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696384 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-multus-conf-dir\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696423 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696444 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-registration-dir\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696454 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-multus-cni-dir\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696458 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-run-netns\") 
pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696478 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-run-netns\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696516 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-multus-conf-dir\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696507 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/81e7762a-1005-4a21-8f55-9dda467004e0-serviceca\") pod \"node-ca-2t2sk\" (UID: \"81e7762a-1005-4a21-8f55-9dda467004e0\") " pod="openshift-image-registry/node-ca-2t2sk" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696583 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696612 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-registration-dir\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: 
\"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696613 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-multus-socket-dir-parent\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.696671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696663 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-os-release\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696690 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-hostroot\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696716 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-run-multus-certs\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696740 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8chxz\" (UniqueName: \"kubernetes.io/projected/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-kube-api-access-8chxz\") pod 
\"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696774 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-hostroot\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696786 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-system-cni-dir\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696790 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-run-multus-certs\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696755 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-os-release\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696813 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-socket-dir\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696827 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696836 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81e7762a-1005-4a21-8f55-9dda467004e0-host\") pod \"node-ca-2t2sk\" (UID: \"81e7762a-1005-4a21-8f55-9dda467004e0\") " pod="openshift-image-registry/node-ca-2t2sk" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696842 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-system-cni-dir\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696870 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7770e551-6840-450b-81fa-c37714dbe265-multus-daemon-config\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696904 2537 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696907 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696927 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-sys-fs\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696970 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-var-lib-cni-bin\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.697314 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696976 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81e7762a-1005-4a21-8f55-9dda467004e0-host\") pod \"node-ca-2t2sk\" (UID: \"81e7762a-1005-4a21-8f55-9dda467004e0\") " pod="openshift-image-registry/node-ca-2t2sk" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696943 
2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-socket-dir\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696997 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-cnibin\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.696944 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/81e7762a-1005-4a21-8f55-9dda467004e0-serviceca\") pod \"node-ca-2t2sk\" (UID: \"81e7762a-1005-4a21-8f55-9dda467004e0\") " pod="openshift-image-registry/node-ca-2t2sk" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697032 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697025 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-var-lib-cni-multus\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697061 2537 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-sys-fs\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697070 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7770e551-6840-450b-81fa-c37714dbe265-cni-binary-copy\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697061 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-var-lib-cni-bin\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697100 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-cnibin\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697102 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-etc-selinux\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697116 2537 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-var-lib-cni-multus\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697146 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-cnibin\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697150 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/366e6f1a-199e-4381-a962-80622ebd99b0-etc-selinux\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697172 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kck8\" (UniqueName: \"kubernetes.io/projected/366e6f1a-199e-4381-a962-80622ebd99b0-kube-api-access-2kck8\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697189 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-cnibin\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.698046 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:37:43.697196 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-var-lib-kubelet\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698046 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697227 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-cni-binary-copy\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.698575 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697253 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-os-release\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698575 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697277 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-run-k8s-cni-cncf-io\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698575 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697298 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-var-lib-kubelet\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698575 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:37:43.697348 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-host-run-k8s-cni-cncf-io\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698575 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697391 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7770e551-6840-450b-81fa-c37714dbe265-os-release\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698575 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697425 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7770e551-6840-450b-81fa-c37714dbe265-multus-daemon-config\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698575 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697554 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7770e551-6840-450b-81fa-c37714dbe265-cni-binary-copy\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.698575 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.697760 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-cni-binary-copy\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.704334 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.704310 2537 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb6mc\" (UniqueName: \"kubernetes.io/projected/81e7762a-1005-4a21-8f55-9dda467004e0-kube-api-access-xb6mc\") pod \"node-ca-2t2sk\" (UID: \"81e7762a-1005-4a21-8f55-9dda467004e0\") " pod="openshift-image-registry/node-ca-2t2sk" Apr 16 20:37:43.704430 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.704405 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnbw9\" (UniqueName: \"kubernetes.io/projected/7770e551-6840-450b-81fa-c37714dbe265-kube-api-access-vnbw9\") pod \"multus-8bj89\" (UID: \"7770e551-6840-450b-81fa-c37714dbe265\") " pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.704945 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.704924 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kck8\" (UniqueName: \"kubernetes.io/projected/366e6f1a-199e-4381-a962-80622ebd99b0-kube-api-access-2kck8\") pod \"aws-ebs-csi-driver-node-6ff6q\" (UID: \"366e6f1a-199e-4381-a962-80622ebd99b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.705022 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.704926 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8chxz\" (UniqueName: \"kubernetes.io/projected/88ebfbf2-8dfe-4d3c-91ed-559a91a0a925-kube-api-access-8chxz\") pod \"multus-additional-cni-plugins-kslqr\" (UID: \"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925\") " pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.789073 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.789001 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kftrw" Apr 16 20:37:43.794612 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.794587 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gsrd4" Apr 16 20:37:43.802363 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.802342 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nlrkh" Apr 16 20:37:43.809143 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.808929 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fz4qv" Apr 16 20:37:43.817724 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.817677 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" Apr 16 20:37:43.825294 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.825272 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" Apr 16 20:37:43.831871 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.831852 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2t2sk" Apr 16 20:37:43.847412 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.847390 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8bj89" Apr 16 20:37:43.852072 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.852054 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kslqr" Apr 16 20:37:43.976172 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:43.976140 2537 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:37:44.100847 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.100766 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmvf\" (UniqueName: \"kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf\") pod \"network-check-target-4cr2r\" (UID: \"e8800de1-77ef-480f-81de-cc93318d33b6\") " pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:44.100847 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.100811 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:44.101031 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:44.100934 2537 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:37:44.101031 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:44.100983 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs podName:11dcf076-13e5-4128-bf13-7e6c86c6dd5b nodeName:}" failed. No retries permitted until 2026-04-16 20:37:45.100970174 +0000 UTC m=+4.086113743 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs") pod "network-metrics-daemon-d6xkn" (UID: "11dcf076-13e5-4128-bf13-7e6c86c6dd5b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:37:44.101031 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:44.100937 2537 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:37:44.101031 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:44.101031 2537 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:37:44.101180 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:44.101043 2537 projected.go:194] Error preparing data for projected volume kube-api-access-nvmvf for pod openshift-network-diagnostics/network-check-target-4cr2r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:37:44.101180 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:44.101088 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf podName:e8800de1-77ef-480f-81de-cc93318d33b6 nodeName:}" failed. No retries permitted until 2026-04-16 20:37:45.101074551 +0000 UTC m=+4.086218137 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nvmvf" (UniqueName: "kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf") pod "network-check-target-4cr2r" (UID: "e8800de1-77ef-480f-81de-cc93318d33b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:37:44.525426 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.525333 2537 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:32:42 +0000 UTC" deadline="2027-11-19 14:34:53.556955831 +0000 UTC" Apr 16 20:37:44.525426 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.525363 2537 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13961h57m9.031595642s" Apr 16 20:37:44.575396 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:44.575187 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f9a3fdc_c6ac_415d_a704_3724ed4158a1.slice/crio-322120176faf13846c4260a6750f1f57512bb39b14340ddfb1518bfbd9414145 WatchSource:0}: Error finding container 322120176faf13846c4260a6750f1f57512bb39b14340ddfb1518bfbd9414145: Status 404 returned error can't find the container with id 322120176faf13846c4260a6750f1f57512bb39b14340ddfb1518bfbd9414145 Apr 16 20:37:44.577324 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:44.577281 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36f6c978_ccae_4818_a370_35e0101bf84f.slice/crio-11db0219cd862b8d374ac488c43f4d5cc126bd955ca3f4d7f30b0ae3190916a7 WatchSource:0}: Error finding container 11db0219cd862b8d374ac488c43f4d5cc126bd955ca3f4d7f30b0ae3190916a7: Status 404 returned error can't find the container with id 
11db0219cd862b8d374ac488c43f4d5cc126bd955ca3f4d7f30b0ae3190916a7 Apr 16 20:37:44.580500 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:44.580462 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode04adda1_dbd0_4e8a_b43f_08fc4771a0d3.slice/crio-207064cc2d0153fe4e5e77561ea441846c588fc3053c80a96627e98da9c5850c WatchSource:0}: Error finding container 207064cc2d0153fe4e5e77561ea441846c588fc3053c80a96627e98da9c5850c: Status 404 returned error can't find the container with id 207064cc2d0153fe4e5e77561ea441846c588fc3053c80a96627e98da9c5850c Apr 16 20:37:44.581641 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:44.581617 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7770e551_6840_450b_81fa_c37714dbe265.slice/crio-5fceac8f46612e8dcdf817be653bfc9e77d70ae314387a00e4520daa3e00dea5 WatchSource:0}: Error finding container 5fceac8f46612e8dcdf817be653bfc9e77d70ae314387a00e4520daa3e00dea5: Status 404 returned error can't find the container with id 5fceac8f46612e8dcdf817be653bfc9e77d70ae314387a00e4520daa3e00dea5 Apr 16 20:37:44.582549 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:44.582520 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81e7762a_1005_4a21_8f55_9dda467004e0.slice/crio-ebb7b3095505be52e757bb0f6c371f35bb8ad92e5186fa16f57ef1cfeec75bbe WatchSource:0}: Error finding container ebb7b3095505be52e757bb0f6c371f35bb8ad92e5186fa16f57ef1cfeec75bbe: Status 404 returned error can't find the container with id ebb7b3095505be52e757bb0f6c371f35bb8ad92e5186fa16f57ef1cfeec75bbe Apr 16 20:37:44.583641 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:44.583350 2537 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43c2ff47_b11f_4e08_99d4_c54547429f56.slice/crio-bc9f6e9fe00bdfa36de2e2cc619c84223e892f2e96a654331728c568fe994019 WatchSource:0}: Error finding container bc9f6e9fe00bdfa36de2e2cc619c84223e892f2e96a654331728c568fe994019: Status 404 returned error can't find the container with id bc9f6e9fe00bdfa36de2e2cc619c84223e892f2e96a654331728c568fe994019 Apr 16 20:37:44.586041 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:44.585686 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74821fb7_65d5_4396_bc93_add3e5936d13.slice/crio-1dfb479920a04c4acbff68fdbb2ee5bcacdce33b878fd706b0265ffa78c2fcb1 WatchSource:0}: Error finding container 1dfb479920a04c4acbff68fdbb2ee5bcacdce33b878fd706b0265ffa78c2fcb1: Status 404 returned error can't find the container with id 1dfb479920a04c4acbff68fdbb2ee5bcacdce33b878fd706b0265ffa78c2fcb1 Apr 16 20:37:44.587370 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:44.587060 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod366e6f1a_199e_4381_a962_80622ebd99b0.slice/crio-11bf39b7d14cb32ff826d659a0052733552871145db3bbb3aa598a6cdd1cc015 WatchSource:0}: Error finding container 11bf39b7d14cb32ff826d659a0052733552871145db3bbb3aa598a6cdd1cc015: Status 404 returned error can't find the container with id 11bf39b7d14cb32ff826d659a0052733552871145db3bbb3aa598a6cdd1cc015 Apr 16 20:37:44.588025 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:37:44.587916 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88ebfbf2_8dfe_4d3c_91ed_559a91a0a925.slice/crio-b5854bc00f6368383e9928edf38b816a55a9f8523362d017c2de3210273826a1 WatchSource:0}: Error finding container b5854bc00f6368383e9928edf38b816a55a9f8523362d017c2de3210273826a1: Status 404 returned error can't find 
the container with id b5854bc00f6368383e9928edf38b816a55a9f8523362d017c2de3210273826a1 Apr 16 20:37:44.590872 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.590529 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8bj89" event={"ID":"7770e551-6840-450b-81fa-c37714dbe265","Type":"ContainerStarted","Data":"5fceac8f46612e8dcdf817be653bfc9e77d70ae314387a00e4520daa3e00dea5"} Apr 16 20:37:44.592618 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.592134 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" event={"ID":"4f9a3fdc-c6ac-415d-a704-3724ed4158a1","Type":"ContainerStarted","Data":"322120176faf13846c4260a6750f1f57512bb39b14340ddfb1518bfbd9414145"} Apr 16 20:37:44.593673 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.593645 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fz4qv" event={"ID":"43c2ff47-b11f-4e08-99d4-c54547429f56","Type":"ContainerStarted","Data":"bc9f6e9fe00bdfa36de2e2cc619c84223e892f2e96a654331728c568fe994019"} Apr 16 20:37:44.594854 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.594832 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kftrw" event={"ID":"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3","Type":"ContainerStarted","Data":"207064cc2d0153fe4e5e77561ea441846c588fc3053c80a96627e98da9c5850c"} Apr 16 20:37:44.595937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.595908 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nlrkh" event={"ID":"36f6c978-ccae-4818-a370-35e0101bf84f","Type":"ContainerStarted","Data":"11db0219cd862b8d374ac488c43f4d5cc126bd955ca3f4d7f30b0ae3190916a7"} Apr 16 20:37:44.596984 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.596953 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" 
event={"ID":"366e6f1a-199e-4381-a962-80622ebd99b0","Type":"ContainerStarted","Data":"11bf39b7d14cb32ff826d659a0052733552871145db3bbb3aa598a6cdd1cc015"} Apr 16 20:37:44.597948 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.597926 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gsrd4" event={"ID":"74821fb7-65d5-4396-bc93-add3e5936d13","Type":"ContainerStarted","Data":"1dfb479920a04c4acbff68fdbb2ee5bcacdce33b878fd706b0265ffa78c2fcb1"} Apr 16 20:37:44.598958 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.598921 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2t2sk" event={"ID":"81e7762a-1005-4a21-8f55-9dda467004e0","Type":"ContainerStarted","Data":"ebb7b3095505be52e757bb0f6c371f35bb8ad92e5186fa16f57ef1cfeec75bbe"} Apr 16 20:37:44.947585 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:44.945897 2537 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:37:45.109630 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:45.109599 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmvf\" (UniqueName: \"kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf\") pod \"network-check-target-4cr2r\" (UID: \"e8800de1-77ef-480f-81de-cc93318d33b6\") " pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:45.109763 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:45.109655 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:45.109825 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:45.109813 2537 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:37:45.109903 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:45.109877 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs podName:11dcf076-13e5-4128-bf13-7e6c86c6dd5b nodeName:}" failed. No retries permitted until 2026-04-16 20:37:47.109855903 +0000 UTC m=+6.094999492 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs") pod "network-metrics-daemon-d6xkn" (UID: "11dcf076-13e5-4128-bf13-7e6c86c6dd5b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:37:45.110090 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:45.109984 2537 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:37:45.110090 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:45.110007 2537 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:37:45.110090 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:45.110019 2537 projected.go:194] Error preparing data for projected volume kube-api-access-nvmvf for pod openshift-network-diagnostics/network-check-target-4cr2r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:37:45.110090 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:45.110068 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf podName:e8800de1-77ef-480f-81de-cc93318d33b6 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:37:47.110055744 +0000 UTC m=+6.095199325 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-nvmvf" (UniqueName: "kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf") pod "network-check-target-4cr2r" (UID: "e8800de1-77ef-480f-81de-cc93318d33b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:37:45.584125 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:45.584055 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:45.584512 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:45.584164 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6" Apr 16 20:37:45.584512 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:45.584356 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:45.584512 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:45.584451 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:37:45.606983 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:45.606953 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kslqr" event={"ID":"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925","Type":"ContainerStarted","Data":"b5854bc00f6368383e9928edf38b816a55a9f8523362d017c2de3210273826a1"} Apr 16 20:37:45.611754 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:45.611727 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" event={"ID":"87d4ca475b53fa90f2c794fc65d796bc","Type":"ContainerStarted","Data":"b0c06f8af5aca5f6f6cbd1ab9f827af3af4e29d3ae1229771e752f18c313abd9"} Apr 16 20:37:45.625297 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:45.625246 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" podStartSLOduration=3.625231713 podStartE2EDuration="3.625231713s" podCreationTimestamp="2026-04-16 20:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:37:45.624968103 +0000 UTC m=+4.610111690" watchObservedRunningTime="2026-04-16 20:37:45.625231713 +0000 UTC m=+4.610375307" Apr 16 20:37:45.625870 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:45.625835 2537 generic.go:358] "Generic (PLEG): container finished" podID="416f09eead2ff50b4bdd823b81a2a56e" containerID="56d902c17ce88b2e17f020858fe6059dd6f8fdf568e1bb9048a51feb212e5e55" exitCode=0 Apr 16 20:37:45.625870 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:45.625869 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" 
event={"ID":"416f09eead2ff50b4bdd823b81a2a56e","Type":"ContainerDied","Data":"56d902c17ce88b2e17f020858fe6059dd6f8fdf568e1bb9048a51feb212e5e55"} Apr 16 20:37:46.650757 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:46.650668 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" event={"ID":"416f09eead2ff50b4bdd823b81a2a56e","Type":"ContainerStarted","Data":"a7c569b0ecc65ac01831bbbcb741eba47af966a8dbf6075c7e467cd17249074e"} Apr 16 20:37:46.662213 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:46.662163 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" podStartSLOduration=4.662145776 podStartE2EDuration="4.662145776s" podCreationTimestamp="2026-04-16 20:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:37:46.662124886 +0000 UTC m=+5.647268477" watchObservedRunningTime="2026-04-16 20:37:46.662145776 +0000 UTC m=+5.647289369" Apr 16 20:37:47.129773 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:47.128939 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmvf\" (UniqueName: \"kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf\") pod \"network-check-target-4cr2r\" (UID: \"e8800de1-77ef-480f-81de-cc93318d33b6\") " pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:47.129773 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:47.128992 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 
16 20:37:47.129773 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:47.129120 2537 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:37:47.129773 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:47.129186 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs podName:11dcf076-13e5-4128-bf13-7e6c86c6dd5b nodeName:}" failed. No retries permitted until 2026-04-16 20:37:51.129167733 +0000 UTC m=+10.114311311 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs") pod "network-metrics-daemon-d6xkn" (UID: "11dcf076-13e5-4128-bf13-7e6c86c6dd5b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:37:47.129773 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:47.129592 2537 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:37:47.129773 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:47.129615 2537 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:37:47.129773 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:47.129630 2537 projected.go:194] Error preparing data for projected volume kube-api-access-nvmvf for pod openshift-network-diagnostics/network-check-target-4cr2r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:37:47.129773 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:47.129691 2537 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf podName:e8800de1-77ef-480f-81de-cc93318d33b6 nodeName:}" failed. No retries permitted until 2026-04-16 20:37:51.12965924 +0000 UTC m=+10.114802810 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-nvmvf" (UniqueName: "kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf") pod "network-check-target-4cr2r" (UID: "e8800de1-77ef-480f-81de-cc93318d33b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:37:47.583981 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:47.583898 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:47.584139 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:47.584044 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:37:47.584139 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:47.584085 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:47.584223 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:47.584162 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6" Apr 16 20:37:49.583242 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:49.583212 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:49.583683 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:49.583354 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:37:49.583683 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:49.583386 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:49.583683 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:49.583533 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6" Apr 16 20:37:50.699666 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:50.699637 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fsph8"] Apr 16 20:37:50.702727 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:50.702703 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:50.702841 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:50.702784 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c" Apr 16 20:37:50.756075 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:50.756043 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7d19a112-53ff-4260-97bb-aaa69848369c-dbus\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:50.756215 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:50.756147 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7d19a112-53ff-4260-97bb-aaa69848369c-kubelet-config\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:50.756215 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:50.756192 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:50.857024 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:50.856507 2537 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7d19a112-53ff-4260-97bb-aaa69848369c-dbus\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:50.857024 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:50.856605 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7d19a112-53ff-4260-97bb-aaa69848369c-kubelet-config\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:50.857024 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:50.856642 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:50.857024 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:50.856691 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7d19a112-53ff-4260-97bb-aaa69848369c-dbus\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:50.857024 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:50.856764 2537 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:37:50.857024 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:50.856769 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7d19a112-53ff-4260-97bb-aaa69848369c-kubelet-config\") pod \"global-pull-secret-syncer-fsph8\" 
(UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:50.857024 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:50.856824 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret podName:7d19a112-53ff-4260-97bb-aaa69848369c nodeName:}" failed. No retries permitted until 2026-04-16 20:37:51.356804611 +0000 UTC m=+10.341948182 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret") pod "global-pull-secret-syncer-fsph8" (UID: "7d19a112-53ff-4260-97bb-aaa69848369c") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:37:51.158802 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:51.158719 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmvf\" (UniqueName: \"kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf\") pod \"network-check-target-4cr2r\" (UID: \"e8800de1-77ef-480f-81de-cc93318d33b6\") " pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:51.158802 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:51.158785 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:51.159007 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:51.158864 2537 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:37:51.159007 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:51.158888 2537 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:37:51.159007 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:51.158900 2537 projected.go:194] Error preparing data for projected volume kube-api-access-nvmvf for pod openshift-network-diagnostics/network-check-target-4cr2r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:37:51.159007 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:51.158913 2537 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:37:51.159007 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:51.158953 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf podName:e8800de1-77ef-480f-81de-cc93318d33b6 nodeName:}" failed. No retries permitted until 2026-04-16 20:37:59.158935881 +0000 UTC m=+18.144079463 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-nvmvf" (UniqueName: "kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf") pod "network-check-target-4cr2r" (UID: "e8800de1-77ef-480f-81de-cc93318d33b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:37:51.159007 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:51.158973 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs podName:11dcf076-13e5-4128-bf13-7e6c86c6dd5b nodeName:}" failed. No retries permitted until 2026-04-16 20:37:59.158963798 +0000 UTC m=+18.144107390 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs") pod "network-metrics-daemon-d6xkn" (UID: "11dcf076-13e5-4128-bf13-7e6c86c6dd5b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:37:51.360223 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:51.360185 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:51.360393 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:51.360369 2537 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:37:51.360469 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:51.360457 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret podName:7d19a112-53ff-4260-97bb-aaa69848369c nodeName:}" failed. No retries permitted until 2026-04-16 20:37:52.360435598 +0000 UTC m=+11.345579182 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret") pod "global-pull-secret-syncer-fsph8" (UID: "7d19a112-53ff-4260-97bb-aaa69848369c") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:37:51.584773 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:51.584260 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:51.584773 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:51.584366 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:37:51.584773 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:51.584429 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:51.584773 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:51.584517 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6" Apr 16 20:37:52.369033 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:52.368986 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:52.369472 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:52.369141 2537 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:37:52.369472 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:52.369214 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret podName:7d19a112-53ff-4260-97bb-aaa69848369c nodeName:}" failed. No retries permitted until 2026-04-16 20:37:54.369193534 +0000 UTC m=+13.354337103 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret") pod "global-pull-secret-syncer-fsph8" (UID: "7d19a112-53ff-4260-97bb-aaa69848369c") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:37:52.583616 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:52.583580 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:52.583784 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:52.583713 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c" Apr 16 20:37:53.583551 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.583330 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:53.584040 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:53.583669 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6" Apr 16 20:37:53.584040 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.583362 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:53.584040 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:53.583849 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:37:53.663897 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.663831 2537 generic.go:358] "Generic (PLEG): container finished" podID="88ebfbf2-8dfe-4d3c-91ed-559a91a0a925" containerID="54e92ae8af66c246c978329da7ab568a2d46fdd5a27b2a033fdfc72c9c0b7066" exitCode=0 Apr 16 20:37:53.664032 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.663906 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kslqr" event={"ID":"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925","Type":"ContainerDied","Data":"54e92ae8af66c246c978329da7ab568a2d46fdd5a27b2a033fdfc72c9c0b7066"} Apr 16 20:37:53.665389 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.665320 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" event={"ID":"366e6f1a-199e-4381-a962-80622ebd99b0","Type":"ContainerStarted","Data":"570282d76fd47faa81de11df7d23162dd502419402359e6e54583dfba3617896"} Apr 16 20:37:53.667038 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.666963 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gsrd4" event={"ID":"74821fb7-65d5-4396-bc93-add3e5936d13","Type":"ContainerStarted","Data":"b08cb3d960411e4182fbd7e09a89ce72f6dd6385c8c90b603d8af01e7ef1815f"} Apr 16 20:37:53.668585 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.668545 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2t2sk" event={"ID":"81e7762a-1005-4a21-8f55-9dda467004e0","Type":"ContainerStarted","Data":"d0407ca687c6624c3757c442a491b64e266ea87df429d1d47f6eabbedb534f0c"} Apr 16 20:37:53.670529 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.670506 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kftrw" 
event={"ID":"e04adda1-dbd0-4e8a-b43f-08fc4771a0d3","Type":"ContainerStarted","Data":"24bf26db02c261562fc52e2187b8f6c1129d906b6524abf0fb9d0db5d09305cb"} Apr 16 20:37:53.672438 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.672415 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nlrkh" event={"ID":"36f6c978-ccae-4818-a370-35e0101bf84f","Type":"ContainerStarted","Data":"0e239189624eb124f6e1ae2b457de62c1642e68d21003260e75c5dcbebb62c9d"} Apr 16 20:37:53.695336 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.695293 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gsrd4" podStartSLOduration=4.459470605 podStartE2EDuration="12.695279937s" podCreationTimestamp="2026-04-16 20:37:41 +0000 UTC" firstStartedPulling="2026-04-16 20:37:44.588194207 +0000 UTC m=+3.573337780" lastFinishedPulling="2026-04-16 20:37:52.824003536 +0000 UTC m=+11.809147112" observedRunningTime="2026-04-16 20:37:53.694971615 +0000 UTC m=+12.680115209" watchObservedRunningTime="2026-04-16 20:37:53.695279937 +0000 UTC m=+12.680423529" Apr 16 20:37:53.704774 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.704732 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2t2sk" podStartSLOduration=4.46464639 podStartE2EDuration="12.704717526s" podCreationTimestamp="2026-04-16 20:37:41 +0000 UTC" firstStartedPulling="2026-04-16 20:37:44.584704599 +0000 UTC m=+3.569848179" lastFinishedPulling="2026-04-16 20:37:52.824775728 +0000 UTC m=+11.809919315" observedRunningTime="2026-04-16 20:37:53.70447088 +0000 UTC m=+12.689614474" watchObservedRunningTime="2026-04-16 20:37:53.704717526 +0000 UTC m=+12.689861117" Apr 16 20:37:53.727882 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.727840 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nlrkh" podStartSLOduration=4.487927658 
podStartE2EDuration="12.727828138s" podCreationTimestamp="2026-04-16 20:37:41 +0000 UTC" firstStartedPulling="2026-04-16 20:37:44.579516694 +0000 UTC m=+3.564660263" lastFinishedPulling="2026-04-16 20:37:52.819417168 +0000 UTC m=+11.804560743" observedRunningTime="2026-04-16 20:37:53.727427803 +0000 UTC m=+12.712571394" watchObservedRunningTime="2026-04-16 20:37:53.727828138 +0000 UTC m=+12.712971729" Apr 16 20:37:53.727963 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:53.727922 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kftrw" podStartSLOduration=4.431303368 podStartE2EDuration="12.727917354s" podCreationTimestamp="2026-04-16 20:37:41 +0000 UTC" firstStartedPulling="2026-04-16 20:37:44.582312064 +0000 UTC m=+3.567455646" lastFinishedPulling="2026-04-16 20:37:52.878926061 +0000 UTC m=+11.864069632" observedRunningTime="2026-04-16 20:37:53.716031018 +0000 UTC m=+12.701174611" watchObservedRunningTime="2026-04-16 20:37:53.727917354 +0000 UTC m=+12.713060946" Apr 16 20:37:54.385208 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:54.385173 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:54.385339 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:54.385298 2537 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:37:54.385409 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:54.385356 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret podName:7d19a112-53ff-4260-97bb-aaa69848369c nodeName:}" failed. 
No retries permitted until 2026-04-16 20:37:58.385339455 +0000 UTC m=+17.370483031 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret") pod "global-pull-secret-syncer-fsph8" (UID: "7d19a112-53ff-4260-97bb-aaa69848369c") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:37:54.583462 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:54.583383 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:54.583630 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:54.583509 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c" Apr 16 20:37:54.676135 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:54.676100 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fz4qv" event={"ID":"43c2ff47-b11f-4e08-99d4-c54547429f56","Type":"ContainerStarted","Data":"1cf2656351575003b1905d32e35a41971b3894cdbd578bd038b524e1a336b424"} Apr 16 20:37:54.691779 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:54.691710 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fz4qv" podStartSLOduration=5.453996758 podStartE2EDuration="13.69169454s" podCreationTimestamp="2026-04-16 20:37:41 +0000 UTC" firstStartedPulling="2026-04-16 20:37:44.586092092 +0000 UTC m=+3.571235670" lastFinishedPulling="2026-04-16 20:37:52.823789878 +0000 UTC m=+11.808933452" observedRunningTime="2026-04-16 20:37:54.691192388 +0000 UTC m=+13.676336023" watchObservedRunningTime="2026-04-16 20:37:54.69169454 +0000 UTC m=+13.676838131" Apr 16 20:37:54.776388 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:54.776359 2537 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nlrkh" Apr 16 20:37:54.777542 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:54.777520 2537 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nlrkh" Apr 16 20:37:55.587543 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:55.587511 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:55.588106 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:55.587648 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6" Apr 16 20:37:55.588106 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:55.587687 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:55.588106 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:55.587804 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:37:56.583420 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:56.583392 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:56.583599 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:56.583512 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c" Apr 16 20:37:56.679722 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:56.679692 2537 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 20:37:57.586612 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:57.586580 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:57.586766 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:57.586586 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:57.586766 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:57.586703 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6" Apr 16 20:37:57.586766 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:57.586759 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:37:58.419428 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:58.419220 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:58.419892 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:58.419352 2537 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:37:58.419892 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:58.419529 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret podName:7d19a112-53ff-4260-97bb-aaa69848369c nodeName:}" failed. No retries permitted until 2026-04-16 20:38:06.4195067 +0000 UTC m=+25.404650275 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret") pod "global-pull-secret-syncer-fsph8" (UID: "7d19a112-53ff-4260-97bb-aaa69848369c") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:37:58.583685 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:58.583579 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:37:58.583827 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:58.583715 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c" Apr 16 20:37:59.225383 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:59.225354 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmvf\" (UniqueName: \"kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf\") pod \"network-check-target-4cr2r\" (UID: \"e8800de1-77ef-480f-81de-cc93318d33b6\") " pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:59.225587 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:59.225407 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:59.225587 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:59.225525 2537 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:37:59.225587 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:59.225547 2537 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:37:59.225587 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:59.225526 2537 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:37:59.225587 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:59.225573 2537 projected.go:194] Error preparing data for projected volume kube-api-access-nvmvf for pod openshift-network-diagnostics/network-check-target-4cr2r: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:37:59.225780 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:59.225627 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs podName:11dcf076-13e5-4128-bf13-7e6c86c6dd5b nodeName:}" failed. No retries permitted until 2026-04-16 20:38:15.225610055 +0000 UTC m=+34.210753637 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs") pod "network-metrics-daemon-d6xkn" (UID: "11dcf076-13e5-4128-bf13-7e6c86c6dd5b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:37:59.225780 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:59.225647 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf podName:e8800de1-77ef-480f-81de-cc93318d33b6 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:15.225637915 +0000 UTC m=+34.210781488 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-nvmvf" (UniqueName: "kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf") pod "network-check-target-4cr2r" (UID: "e8800de1-77ef-480f-81de-cc93318d33b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:37:59.583365 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:59.583277 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:37:59.583365 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:37:59.583295 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:37:59.583852 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:59.583412 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:37:59.583852 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:37:59.583537 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6" Apr 16 20:38:00.176138 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:00.176110 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nlrkh" Apr 16 20:38:00.176319 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:00.176247 2537 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 20:38:00.177038 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:00.177018 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nlrkh" Apr 16 20:38:00.583761 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:00.583685 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:38:00.584185 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:00.583809 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c" Apr 16 20:38:01.584685 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:01.584655 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:38:01.585141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:01.584756 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:38:01.585141 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:01.584783 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:38:01.585141 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:01.584822 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6" Apr 16 20:38:02.583379 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:02.583343 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:38:02.583530 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:02.583473 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c" Apr 16 20:38:03.583426 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:03.583394 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:38:03.583931 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:03.583512 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6" Apr 16 20:38:03.583931 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:03.583592 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:38:03.583931 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:03.583723 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:38:04.383085 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.383039 2537 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:38:04.548402 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.548144 2537 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:38:04.383059481Z","UUID":"50b0c00a-9698-45be-b2e7-4d0864aca5ef","Handler":null,"Name":"","Endpoint":""} Apr 16 20:38:04.549839 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.549822 2537 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 20:38:04.549911 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.549846 2537 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 20:38:04.583406 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.583326 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:38:04.583466 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:04.583422 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c" Apr 16 20:38:04.695895 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.695872 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" event={"ID":"366e6f1a-199e-4381-a962-80622ebd99b0","Type":"ContainerStarted","Data":"6efa7fe9e5eed731e14aa5fe4172fb9146e817624e2bd483a17d784b90a8bf33"} Apr 16 20:38:04.696995 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.696976 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8bj89" event={"ID":"7770e551-6840-450b-81fa-c37714dbe265","Type":"ContainerStarted","Data":"d370003b3953f8ed13a4021b2539933021edc0a7edd337404f1a12c64f9128e2"} Apr 16 20:38:04.698874 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.698850 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" event={"ID":"4f9a3fdc-c6ac-415d-a704-3724ed4158a1","Type":"ContainerStarted","Data":"4fe8026c1eae69de6dfce76b4d68200fc0d9118f9c5ee7edce741a2c45c65f1b"} Apr 16 20:38:04.698952 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.698878 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" event={"ID":"4f9a3fdc-c6ac-415d-a704-3724ed4158a1","Type":"ContainerStarted","Data":"667a69025813fecb6878d4f3c7f3801de082a673e6894a821869fdfd783691be"} Apr 16 20:38:04.698952 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.698892 2537 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" event={"ID":"4f9a3fdc-c6ac-415d-a704-3724ed4158a1","Type":"ContainerStarted","Data":"5dc3a0d8febb684e2e35690740b7b6018b4c6b7146558eae5ad63b8df723c134"} Apr 16 20:38:04.698952 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.698902 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" event={"ID":"4f9a3fdc-c6ac-415d-a704-3724ed4158a1","Type":"ContainerStarted","Data":"479746998e8d89401efc51d206a72e817bbff17a8585752aa59e0e13b4c98225"} Apr 16 20:38:04.700288 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.700269 2537 generic.go:358] "Generic (PLEG): container finished" podID="88ebfbf2-8dfe-4d3c-91ed-559a91a0a925" containerID="8a2abbce9979208f83495708fd01dc4f6343821c4ddf169099b6f926c2e122a5" exitCode=0 Apr 16 20:38:04.700363 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.700296 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kslqr" event={"ID":"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925","Type":"ContainerDied","Data":"8a2abbce9979208f83495708fd01dc4f6343821c4ddf169099b6f926c2e122a5"} Apr 16 20:38:04.721311 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:04.721277 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8bj89" podStartSLOduration=4.068701521 podStartE2EDuration="23.721268116s" podCreationTimestamp="2026-04-16 20:37:41 +0000 UTC" firstStartedPulling="2026-04-16 20:37:44.583439043 +0000 UTC m=+3.568582626" lastFinishedPulling="2026-04-16 20:38:04.236005651 +0000 UTC m=+23.221149221" observedRunningTime="2026-04-16 20:38:04.721000255 +0000 UTC m=+23.706143847" watchObservedRunningTime="2026-04-16 20:38:04.721268116 +0000 UTC m=+23.706411706" Apr 16 20:38:05.583387 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:05.583316 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:38:05.583387 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:05.583367 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:38:05.583931 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:05.583458 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:38:05.583931 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:05.583594 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6" Apr 16 20:38:05.703529 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:05.703502 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" event={"ID":"4f9a3fdc-c6ac-415d-a704-3724ed4158a1","Type":"ContainerStarted","Data":"1692ceea25b50380dd655e9089ca7c20383513470d53c0140db7091b68d8c2c5"} Apr 16 20:38:05.703529 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:05.703533 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" event={"ID":"4f9a3fdc-c6ac-415d-a704-3724ed4158a1","Type":"ContainerStarted","Data":"b1aeced2e92f20173741a72511156237865397e9808af10d8a832500bf712a86"} Apr 16 20:38:06.483297 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:06.483274 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:38:06.483384 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:06.483369 2537 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:38:06.483422 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:06.483412 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret podName:7d19a112-53ff-4260-97bb-aaa69848369c nodeName:}" failed. No retries permitted until 2026-04-16 20:38:22.483400352 +0000 UTC m=+41.468543922 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret") pod "global-pull-secret-syncer-fsph8" (UID: "7d19a112-53ff-4260-97bb-aaa69848369c") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:38:06.583429 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:06.583383 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8"
Apr 16 20:38:06.583517 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:06.583467 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c"
Apr 16 20:38:06.706440 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:06.706417 2537 generic.go:358] "Generic (PLEG): container finished" podID="88ebfbf2-8dfe-4d3c-91ed-559a91a0a925" containerID="ab952c25397151356a6d261b2bfa356226bd08e56ebe390561debf7cdd8344b3" exitCode=0
Apr 16 20:38:06.706796 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:06.706486 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kslqr" event={"ID":"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925","Type":"ContainerDied","Data":"ab952c25397151356a6d261b2bfa356226bd08e56ebe390561debf7cdd8344b3"}
Apr 16 20:38:06.708222 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:06.708194 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" event={"ID":"366e6f1a-199e-4381-a962-80622ebd99b0","Type":"ContainerStarted","Data":"ed510d287bda03157b83248f35185678acfad390f91e4877e014007e3f6e315f"}
Apr 16 20:38:06.744061 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:06.744025 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ff6q" podStartSLOduration=4.339287785 podStartE2EDuration="25.74401661s" podCreationTimestamp="2026-04-16 20:37:41 +0000 UTC" firstStartedPulling="2026-04-16 20:37:44.589090954 +0000 UTC m=+3.574234539" lastFinishedPulling="2026-04-16 20:38:05.993819789 +0000 UTC m=+24.978963364" observedRunningTime="2026-04-16 20:38:06.743988255 +0000 UTC m=+25.729131846" watchObservedRunningTime="2026-04-16 20:38:06.74401661 +0000 UTC m=+25.729160200"
Apr 16 20:38:07.583038 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:07.583011 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:38:07.583206 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:07.583016 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:38:07.583206 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:07.583136 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b"
Apr 16 20:38:07.583412 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:07.583198 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6"
Apr 16 20:38:07.712210 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:07.712179 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" event={"ID":"4f9a3fdc-c6ac-415d-a704-3724ed4158a1","Type":"ContainerStarted","Data":"8b3bcc8ef35e0a16128163ebf0cedb1394080bc7f933f917bf5d3ac7f75e8810"}
Apr 16 20:38:08.583399 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:08.583344 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8"
Apr 16 20:38:08.583499 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:08.583433 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c"
Apr 16 20:38:08.715889 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:08.715856 2537 generic.go:358] "Generic (PLEG): container finished" podID="88ebfbf2-8dfe-4d3c-91ed-559a91a0a925" containerID="6265c6006c0f1df911f4044ccbeaf88099ed700c976486bc261cd742644c91c2" exitCode=0
Apr 16 20:38:08.716213 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:08.715925 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kslqr" event={"ID":"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925","Type":"ContainerDied","Data":"6265c6006c0f1df911f4044ccbeaf88099ed700c976486bc261cd742644c91c2"}
Apr 16 20:38:09.584221 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:09.584059 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:38:09.584395 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:09.584104 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:38:09.584395 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:09.584302 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b"
Apr 16 20:38:09.584395 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:09.584349 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6"
Apr 16 20:38:09.720233 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:09.720198 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" event={"ID":"4f9a3fdc-c6ac-415d-a704-3724ed4158a1","Type":"ContainerStarted","Data":"39398a559db1e21a06dec2433022de101cfaf4b3830e37a531497269046d5362"}
Apr 16 20:38:09.720648 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:09.720470 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:38:09.720648 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:09.720492 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:38:09.734169 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:09.734146 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:38:09.745085 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:09.745046 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk" podStartSLOduration=9.073754591 podStartE2EDuration="28.745034956s" podCreationTimestamp="2026-04-16 20:37:41 +0000 UTC" firstStartedPulling="2026-04-16 20:37:44.577816683 +0000 UTC m=+3.562960267" lastFinishedPulling="2026-04-16 20:38:04.249097048 +0000 UTC m=+23.234240632" observedRunningTime="2026-04-16 20:38:09.744745388 +0000 UTC m=+28.729888998" watchObservedRunningTime="2026-04-16 20:38:09.745034956 +0000 UTC m=+28.730178547"
Apr 16 20:38:10.583577 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:10.583535 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8"
Apr 16 20:38:10.583723 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:10.583680 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c"
Apr 16 20:38:10.722897 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:10.722868 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:38:10.735545 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:10.735523 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:38:11.032181 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:11.032113 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fsph8"]
Apr 16 20:38:11.032312 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:11.032235 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8"
Apr 16 20:38:11.032364 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:11.032331 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c"
Apr 16 20:38:11.034464 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:11.034444 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d6xkn"]
Apr 16 20:38:11.034546 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:11.034531 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:38:11.034631 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:11.034617 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b"
Apr 16 20:38:11.045572 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:11.045539 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4cr2r"]
Apr 16 20:38:11.045639 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:11.045628 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:38:11.045738 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:11.045720 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6"
Apr 16 20:38:12.583664 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:12.583634 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:38:12.584110 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:12.583745 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8"
Apr 16 20:38:12.584110 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:12.583753 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b"
Apr 16 20:38:12.584110 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:12.583823 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c"
Apr 16 20:38:12.584110 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:12.583857 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:38:12.584110 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:12.583908 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6"
Apr 16 20:38:14.583768 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:14.583735 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8"
Apr 16 20:38:14.584227 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:14.583776 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:38:14.584227 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:14.583814 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:38:14.584227 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:14.583923 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6"
Apr 16 20:38:14.584227 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:14.584028 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c"
Apr 16 20:38:14.584227 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:14.584104 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b"
Apr 16 20:38:15.256015 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:15.255984 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:38:15.256180 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:15.256076 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmvf\" (UniqueName: \"kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf\") pod \"network-check-target-4cr2r\" (UID: \"e8800de1-77ef-480f-81de-cc93318d33b6\") " pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:38:15.256180 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:15.256136 2537 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:38:15.256295 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:15.256183 2537 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:38:15.256295 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:15.256201 2537 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:38:15.256295 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:15.256211 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs podName:11dcf076-13e5-4128-bf13-7e6c86c6dd5b nodeName:}" failed. No retries permitted until 2026-04-16 20:38:47.256190566 +0000 UTC m=+66.241334147 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs") pod "network-metrics-daemon-d6xkn" (UID: "11dcf076-13e5-4128-bf13-7e6c86c6dd5b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:38:15.256295 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:15.256213 2537 projected.go:194] Error preparing data for projected volume kube-api-access-nvmvf for pod openshift-network-diagnostics/network-check-target-4cr2r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:38:15.256295 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:15.256263 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf podName:e8800de1-77ef-480f-81de-cc93318d33b6 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:47.256248945 +0000 UTC m=+66.241392534 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-nvmvf" (UniqueName: "kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf") pod "network-check-target-4cr2r" (UID: "e8800de1-77ef-480f-81de-cc93318d33b6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:38:16.583838 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:16.583648 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8"
Apr 16 20:38:16.584301 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:16.583652 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:38:16.584301 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:16.583936 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fsph8" podUID="7d19a112-53ff-4260-97bb-aaa69848369c"
Apr 16 20:38:16.584301 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:16.583662 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:38:16.584445 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:16.584350 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cr2r" podUID="e8800de1-77ef-480f-81de-cc93318d33b6"
Apr 16 20:38:16.584528 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:16.584494 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b"
Apr 16 20:38:17.737354 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.737318 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kslqr" event={"ID":"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925","Type":"ContainerStarted","Data":"56f6499439813a55585b363a1051d381783e8e0aa2dff23c23594a475c879ed8"}
Apr 16 20:38:17.850379 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.850342 2537 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeReady"
Apr 16 20:38:17.850510 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.850502 2537 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 20:38:17.884160 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.884131 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"]
Apr 16 20:38:17.898972 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.898947 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-657b97c9cd-g52rs"]
Apr 16 20:38:17.899144 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.899121 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"
Apr 16 20:38:17.901788 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.901622 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 20:38:17.901788 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.901677 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 20:38:17.901935 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.901894 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-pgwz8\""
Apr 16 20:38:17.917639 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.917619 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"]
Apr 16 20:38:17.917639 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.917642 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-657b97c9cd-g52rs"]
Apr 16 20:38:17.917796 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.917652 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fw7ch"]
Apr 16 20:38:17.917796 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.917778 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:17.920447 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.920422 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 20:38:17.921689 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.921447 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 20:38:17.921689 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.921458 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-66wkg\""
Apr 16 20:38:17.922285 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.921974 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 20:38:17.927877 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.927860 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 20:38:17.936446 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.936430 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dxvbz"]
Apr 16 20:38:17.936602 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.936576 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:17.938991 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.938968 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 20:38:17.939085 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.938968 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wwh4f\""
Apr 16 20:38:17.939085 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.938968 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 20:38:17.960360 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.960343 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dxvbz"]
Apr 16 20:38:17.960360 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.960363 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fw7ch"]
Apr 16 20:38:17.960473 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.960441 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dxvbz"
Apr 16 20:38:17.962604 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.962578 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 20:38:17.962740 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.962726 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 20:38:17.962801 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.962767 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 20:38:17.962850 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.962834 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7vq8v\""
Apr 16 20:38:17.976001 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.975976 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-bound-sa-token\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:17.976097 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976027 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4qqb\" (UniqueName: \"kubernetes.io/projected/e45e2b17-af71-470b-a92b-013389ef5f6c-kube-api-access-t4qqb\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz"
Apr 16 20:38:17.976097 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976053 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f300a967-50a7-4aca-b7dc-57b123274d17-image-registry-private-configuration\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:17.976097 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976078 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f300a967-50a7-4aca-b7dc-57b123274d17-trusted-ca\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:17.976247 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976102 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98nw\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-kube-api-access-s98nw\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:17.976247 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976193 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmrh6\" (UniqueName: \"kubernetes.io/projected/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-kube-api-access-bmrh6\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:17.976343 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976248 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"
Apr 16 20:38:17.976343 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976277 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:17.976343 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976303 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-tmp-dir\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:17.976466 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976340 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f300a967-50a7-4aca-b7dc-57b123274d17-registry-certificates\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:17.976466 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976369 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f300a967-50a7-4aca-b7dc-57b123274d17-installation-pull-secrets\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:17.976466 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976412 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-config-volume\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:17.976466 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976436 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"
Apr 16 20:38:17.976466 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976452 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:17.976641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976467 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f300a967-50a7-4aca-b7dc-57b123274d17-ca-trust-extracted\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:17.976641 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:17.976492 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz"
Apr 16 20:38:18.077307 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077274 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmrh6\" (UniqueName: \"kubernetes.io/projected/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-kube-api-access-bmrh6\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:18.077307 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077308 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"
Apr 16 20:38:18.077544 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077331 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:18.077544 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077347 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-tmp-dir\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:18.077544 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077385 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f300a967-50a7-4aca-b7dc-57b123274d17-registry-certificates\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:18.077544 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077412 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f300a967-50a7-4aca-b7dc-57b123274d17-installation-pull-secrets\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:18.077544 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.077428 2537 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:38:18.077544 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.077446 2537 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:38:18.077544 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077436 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-config-volume\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:18.077544 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.077508 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls podName:8ff520c7-8bbe-42a7-8fc6-dced59fa3098 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:18.577486776 +0000 UTC m=+37.562630353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls") pod "dns-default-fw7ch" (UID: "8ff520c7-8bbe-42a7-8fc6-dced59fa3098") : secret "dns-default-metrics-tls" not found
Apr 16 20:38:18.077544 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.077543 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert podName:963c2c1a-13eb-461e-bac5-d7a50b6c68ca nodeName:}" failed. No retries permitted until 2026-04-16 20:38:18.57753249 +0000 UTC m=+37.562676065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2hb7" (UID: "963c2c1a-13eb-461e-bac5-d7a50b6c68ca") : secret "networking-console-plugin-cert" not found
Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077584 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"
Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077613 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077649 2537 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f300a967-50a7-4aca-b7dc-57b123274d17-ca-trust-extracted\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077678 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz" Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077740 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-tmp-dir\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch" Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077750 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-bound-sa-token\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077818 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4qqb\" (UniqueName: \"kubernetes.io/projected/e45e2b17-af71-470b-a92b-013389ef5f6c-kube-api-access-t4qqb\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz" Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 
20:38:18.077845 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f300a967-50a7-4aca-b7dc-57b123274d17-image-registry-private-configuration\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077870 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f300a967-50a7-4aca-b7dc-57b123274d17-trusted-ca\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.077897 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s98nw\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-kube-api-access-s98nw\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.078025 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f300a967-50a7-4aca-b7dc-57b123274d17-registry-certificates\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.078034 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-config-volume\") pod \"dns-default-fw7ch\" 
(UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch" Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.078038 2537 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:38:18.078069 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.078060 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f300a967-50a7-4aca-b7dc-57b123274d17-ca-trust-extracted\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.078590 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.078095 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert podName:e45e2b17-af71-470b-a92b-013389ef5f6c nodeName:}" failed. No retries permitted until 2026-04-16 20:38:18.578079113 +0000 UTC m=+37.563222685 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert") pod "ingress-canary-dxvbz" (UID: "e45e2b17-af71-470b-a92b-013389ef5f6c") : secret "canary-serving-cert" not found Apr 16 20:38:18.078590 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.078107 2537 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:38:18.078590 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.078118 2537 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-657b97c9cd-g52rs: secret "image-registry-tls" not found Apr 16 20:38:18.078590 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.078165 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls podName:f300a967-50a7-4aca-b7dc-57b123274d17 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:18.578152193 +0000 UTC m=+37.563295779 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls") pod "image-registry-657b97c9cd-g52rs" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17") : secret "image-registry-tls" not found Apr 16 20:38:18.078590 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.078260 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7" Apr 16 20:38:18.078845 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.078827 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f300a967-50a7-4aca-b7dc-57b123274d17-trusted-ca\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.081453 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.081431 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f300a967-50a7-4aca-b7dc-57b123274d17-image-registry-private-configuration\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.081536 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.081469 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f300a967-50a7-4aca-b7dc-57b123274d17-installation-pull-secrets\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " 
pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.086360 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.086290 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmrh6\" (UniqueName: \"kubernetes.io/projected/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-kube-api-access-bmrh6\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch" Apr 16 20:38:18.086450 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.086369 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98nw\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-kube-api-access-s98nw\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.086450 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.086416 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-bound-sa-token\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.086524 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.086451 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4qqb\" (UniqueName: \"kubernetes.io/projected/e45e2b17-af71-470b-a92b-013389ef5f6c-kube-api-access-t4qqb\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz" Apr 16 20:38:18.580312 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.580276 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") 
pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:18.580312 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.580312 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz" Apr 16 20:38:18.580548 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.580368 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7" Apr 16 20:38:18.580548 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.580386 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch" Apr 16 20:38:18.580548 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.580420 2537 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:38:18.580548 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.580439 2537 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-657b97c9cd-g52rs: secret "image-registry-tls" not found Apr 16 20:38:18.580548 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.580488 2537 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:38:18.580548 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.580495 2537 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:38:18.580548 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.580502 2537 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:38:18.580548 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.580513 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls podName:f300a967-50a7-4aca-b7dc-57b123274d17 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:19.580490618 +0000 UTC m=+38.565634208 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls") pod "image-registry-657b97c9cd-g52rs" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17") : secret "image-registry-tls" not found Apr 16 20:38:18.580548 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.580537 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls podName:8ff520c7-8bbe-42a7-8fc6-dced59fa3098 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:19.580521429 +0000 UTC m=+38.565664998 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls") pod "dns-default-fw7ch" (UID: "8ff520c7-8bbe-42a7-8fc6-dced59fa3098") : secret "dns-default-metrics-tls" not found Apr 16 20:38:18.580548 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.580554 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert podName:e45e2b17-af71-470b-a92b-013389ef5f6c nodeName:}" failed. No retries permitted until 2026-04-16 20:38:19.580546037 +0000 UTC m=+38.565689611 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert") pod "ingress-canary-dxvbz" (UID: "e45e2b17-af71-470b-a92b-013389ef5f6c") : secret "canary-serving-cert" not found Apr 16 20:38:18.580887 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:18.580590 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert podName:963c2c1a-13eb-461e-bac5-d7a50b6c68ca nodeName:}" failed. No retries permitted until 2026-04-16 20:38:19.580582266 +0000 UTC m=+38.565725836 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2hb7" (UID: "963c2c1a-13eb-461e-bac5-d7a50b6c68ca") : secret "networking-console-plugin-cert" not found Apr 16 20:38:18.583368 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.583355 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn" Apr 16 20:38:18.583449 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.583355 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8" Apr 16 20:38:18.583495 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.583470 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r" Apr 16 20:38:18.585796 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.585780 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:38:18.585889 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.585814 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 20:38:18.586767 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.586744 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:38:18.586871 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.586800 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vdj2n\"" Apr 16 20:38:18.586871 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.586822 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:38:18.586960 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.586822 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-484zf\"" Apr 16 20:38:18.741112 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.741088 2537 generic.go:358] "Generic (PLEG): container finished" podID="88ebfbf2-8dfe-4d3c-91ed-559a91a0a925" containerID="56f6499439813a55585b363a1051d381783e8e0aa2dff23c23594a475c879ed8" exitCode=0 Apr 16 20:38:18.741551 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:18.741135 2537 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-additional-cni-plugins-kslqr" event={"ID":"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925","Type":"ContainerDied","Data":"56f6499439813a55585b363a1051d381783e8e0aa2dff23c23594a475c879ed8"} Apr 16 20:38:19.586199 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:19.586170 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:38:19.586374 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:19.586202 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz" Apr 16 20:38:19.586374 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:19.586259 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7" Apr 16 20:38:19.586374 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:19.586284 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch" Apr 16 20:38:19.586374 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:19.586353 2537 projected.go:264] Couldn't 
get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:38:19.586520 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:19.586379 2537 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-657b97c9cd-g52rs: secret "image-registry-tls" not found Apr 16 20:38:19.586520 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:19.586385 2537 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:38:19.586520 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:19.586430 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls podName:f300a967-50a7-4aca-b7dc-57b123274d17 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:21.586414484 +0000 UTC m=+40.571558057 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls") pod "image-registry-657b97c9cd-g52rs" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17") : secret "image-registry-tls" not found Apr 16 20:38:19.586520 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:19.586351 2537 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:38:19.586520 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:19.586459 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert podName:e45e2b17-af71-470b-a92b-013389ef5f6c nodeName:}" failed. No retries permitted until 2026-04-16 20:38:21.586440405 +0000 UTC m=+40.571583980 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert") pod "ingress-canary-dxvbz" (UID: "e45e2b17-af71-470b-a92b-013389ef5f6c") : secret "canary-serving-cert" not found Apr 16 20:38:19.586520 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:19.586390 2537 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:38:19.586520 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:19.586486 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert podName:963c2c1a-13eb-461e-bac5-d7a50b6c68ca nodeName:}" failed. No retries permitted until 2026-04-16 20:38:21.586474565 +0000 UTC m=+40.571618134 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2hb7" (UID: "963c2c1a-13eb-461e-bac5-d7a50b6c68ca") : secret "networking-console-plugin-cert" not found Apr 16 20:38:19.586520 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:19.586507 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls podName:8ff520c7-8bbe-42a7-8fc6-dced59fa3098 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:21.586494946 +0000 UTC m=+40.571638516 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls") pod "dns-default-fw7ch" (UID: "8ff520c7-8bbe-42a7-8fc6-dced59fa3098") : secret "dns-default-metrics-tls" not found Apr 16 20:38:19.745626 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:19.745596 2537 generic.go:358] "Generic (PLEG): container finished" podID="88ebfbf2-8dfe-4d3c-91ed-559a91a0a925" containerID="a136dd1eea9a08c4c7e71443cb1004f9a0bd070893a693e1d310b34ab4a0c10f" exitCode=0 Apr 16 20:38:19.746084 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:19.745643 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kslqr" event={"ID":"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925","Type":"ContainerDied","Data":"a136dd1eea9a08c4c7e71443cb1004f9a0bd070893a693e1d310b34ab4a0c10f"} Apr 16 20:38:20.751185 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:20.751151 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kslqr" event={"ID":"88ebfbf2-8dfe-4d3c-91ed-559a91a0a925","Type":"ContainerStarted","Data":"c20ca328259ce15b5c33dba160ab4a7c8f7c7d89b312705793b315183a746456"} Apr 16 20:38:20.774720 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:20.774681 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kslqr" podStartSLOduration=6.804091727 podStartE2EDuration="39.774667739s" podCreationTimestamp="2026-04-16 20:37:41 +0000 UTC" firstStartedPulling="2026-04-16 20:37:44.589706437 +0000 UTC m=+3.574850019" lastFinishedPulling="2026-04-16 20:38:17.560282459 +0000 UTC m=+36.545426031" observedRunningTime="2026-04-16 20:38:20.773088694 +0000 UTC m=+39.758232295" watchObservedRunningTime="2026-04-16 20:38:20.774667739 +0000 UTC m=+39.759811329" Apr 16 20:38:21.601455 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:21.601423 2537 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:21.601455 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:21.601458 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz"
Apr 16 20:38:21.601664 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:21.601537 2537 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:38:21.601664 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:21.601541 2537 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:38:21.601664 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:21.601568 2537 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-657b97c9cd-g52rs: secret "image-registry-tls" not found
Apr 16 20:38:21.601664 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:21.601599 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert podName:e45e2b17-af71-470b-a92b-013389ef5f6c nodeName:}" failed. No retries permitted until 2026-04-16 20:38:25.601585615 +0000 UTC m=+44.586729185 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert") pod "ingress-canary-dxvbz" (UID: "e45e2b17-af71-470b-a92b-013389ef5f6c") : secret "canary-serving-cert" not found
Apr 16 20:38:21.601785 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:21.601677 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls podName:f300a967-50a7-4aca-b7dc-57b123274d17 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:25.601658658 +0000 UTC m=+44.586802241 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls") pod "image-registry-657b97c9cd-g52rs" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17") : secret "image-registry-tls" not found
Apr 16 20:38:21.601785 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:21.601750 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"
Apr 16 20:38:21.601785 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:21.601775 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:21.601887 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:21.601869 2537 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:38:21.601887 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:21.601872 2537 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:38:21.601950 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:21.601913 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert podName:963c2c1a-13eb-461e-bac5-d7a50b6c68ca nodeName:}" failed. No retries permitted until 2026-04-16 20:38:25.601900402 +0000 UTC m=+44.587043981 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2hb7" (UID: "963c2c1a-13eb-461e-bac5-d7a50b6c68ca") : secret "networking-console-plugin-cert" not found
Apr 16 20:38:21.601950 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:21.601929 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls podName:8ff520c7-8bbe-42a7-8fc6-dced59fa3098 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:25.601921197 +0000 UTC m=+44.587064770 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls") pod "dns-default-fw7ch" (UID: "8ff520c7-8bbe-42a7-8fc6-dced59fa3098") : secret "dns-default-metrics-tls" not found
Apr 16 20:38:22.506643 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:22.506614 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8"
Apr 16 20:38:22.508870 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:22.508848 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d19a112-53ff-4260-97bb-aaa69848369c-original-pull-secret\") pod \"global-pull-secret-syncer-fsph8\" (UID: \"7d19a112-53ff-4260-97bb-aaa69848369c\") " pod="kube-system/global-pull-secret-syncer-fsph8"
Apr 16 20:38:22.798763 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:22.798701 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fsph8"
Apr 16 20:38:22.969842 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:22.969646 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fsph8"]
Apr 16 20:38:22.973064 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:38:22.973038 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d19a112_53ff_4260_97bb_aaa69848369c.slice/crio-8d1f1c990169cdc5ac5119cd50f65bde0230668b41b1fb8a544b9756b3113d57 WatchSource:0}: Error finding container 8d1f1c990169cdc5ac5119cd50f65bde0230668b41b1fb8a544b9756b3113d57: Status 404 returned error can't find the container with id 8d1f1c990169cdc5ac5119cd50f65bde0230668b41b1fb8a544b9756b3113d57
Apr 16 20:38:23.757147 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:23.757106 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fsph8" event={"ID":"7d19a112-53ff-4260-97bb-aaa69848369c","Type":"ContainerStarted","Data":"8d1f1c990169cdc5ac5119cd50f65bde0230668b41b1fb8a544b9756b3113d57"}
Apr 16 20:38:25.626122 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:25.626091 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"
Apr 16 20:38:25.626122 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:25.626125 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:25.626538 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:25.626152 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:25.626538 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:25.626213 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz"
Apr 16 20:38:25.626538 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:25.626230 2537 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:38:25.626538 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:25.626234 2537 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:38:25.626538 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:25.626271 2537 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:38:25.626538 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:25.626287 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert podName:963c2c1a-13eb-461e-bac5-d7a50b6c68ca nodeName:}" failed. No retries permitted until 2026-04-16 20:38:33.626272288 +0000 UTC m=+52.611415857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2hb7" (UID: "963c2c1a-13eb-461e-bac5-d7a50b6c68ca") : secret "networking-console-plugin-cert" not found
Apr 16 20:38:25.626538 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:25.626296 2537 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:38:25.626538 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:25.626305 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls podName:8ff520c7-8bbe-42a7-8fc6-dced59fa3098 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:33.626293354 +0000 UTC m=+52.611436924 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls") pod "dns-default-fw7ch" (UID: "8ff520c7-8bbe-42a7-8fc6-dced59fa3098") : secret "dns-default-metrics-tls" not found
Apr 16 20:38:25.626538 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:25.626240 2537 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-657b97c9cd-g52rs: secret "image-registry-tls" not found
Apr 16 20:38:25.626538 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:25.626326 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert podName:e45e2b17-af71-470b-a92b-013389ef5f6c nodeName:}" failed. No retries permitted until 2026-04-16 20:38:33.626317441 +0000 UTC m=+52.611461010 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert") pod "ingress-canary-dxvbz" (UID: "e45e2b17-af71-470b-a92b-013389ef5f6c") : secret "canary-serving-cert" not found
Apr 16 20:38:25.626538 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:25.626339 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls podName:f300a967-50a7-4aca-b7dc-57b123274d17 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:33.626332718 +0000 UTC m=+52.611476287 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls") pod "image-registry-657b97c9cd-g52rs" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17") : secret "image-registry-tls" not found
Apr 16 20:38:28.767746 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:28.767667 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fsph8" event={"ID":"7d19a112-53ff-4260-97bb-aaa69848369c","Type":"ContainerStarted","Data":"0647aa1d729f69d52a6c9acf3aee2fd764d1c984bbc3ba416dd3ad8fe9ba0481"}
Apr 16 20:38:28.781849 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:28.781807 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fsph8" podStartSLOduration=33.345582435 podStartE2EDuration="38.781794166s" podCreationTimestamp="2026-04-16 20:37:50 +0000 UTC" firstStartedPulling="2026-04-16 20:38:22.974816517 +0000 UTC m=+41.959960089" lastFinishedPulling="2026-04-16 20:38:28.411028247 +0000 UTC m=+47.396171820" observedRunningTime="2026-04-16 20:38:28.780865616 +0000 UTC m=+47.766009205" watchObservedRunningTime="2026-04-16 20:38:28.781794166 +0000 UTC m=+47.766937757"
Apr 16 20:38:33.683115 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:33.683076 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"
Apr 16 20:38:33.683115 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:33.683116 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:33.683736 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:33.683139 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:33.683736 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:33.683158 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz"
Apr 16 20:38:33.683736 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:33.683212 2537 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:38:33.683736 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:33.683238 2537 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:38:33.683736 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:33.683284 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert podName:963c2c1a-13eb-461e-bac5-d7a50b6c68ca nodeName:}" failed. No retries permitted until 2026-04-16 20:38:49.683267345 +0000 UTC m=+68.668410941 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2hb7" (UID: "963c2c1a-13eb-461e-bac5-d7a50b6c68ca") : secret "networking-console-plugin-cert" not found
Apr 16 20:38:33.683736 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:33.683282 2537 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:38:33.683736 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:33.683301 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert podName:e45e2b17-af71-470b-a92b-013389ef5f6c nodeName:}" failed. No retries permitted until 2026-04-16 20:38:49.68329416 +0000 UTC m=+68.668437729 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert") pod "ingress-canary-dxvbz" (UID: "e45e2b17-af71-470b-a92b-013389ef5f6c") : secret "canary-serving-cert" not found
Apr 16 20:38:33.683736 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:33.683295 2537 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:38:33.683736 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:33.683316 2537 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-657b97c9cd-g52rs: secret "image-registry-tls" not found
Apr 16 20:38:33.683736 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:33.683343 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls podName:8ff520c7-8bbe-42a7-8fc6-dced59fa3098 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:49.683325104 +0000 UTC m=+68.668468673 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls") pod "dns-default-fw7ch" (UID: "8ff520c7-8bbe-42a7-8fc6-dced59fa3098") : secret "dns-default-metrics-tls" not found
Apr 16 20:38:33.683736 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:33.683363 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls podName:f300a967-50a7-4aca-b7dc-57b123274d17 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:49.683353269 +0000 UTC m=+68.668496838 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls") pod "image-registry-657b97c9cd-g52rs" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17") : secret "image-registry-tls" not found
Apr 16 20:38:42.737039 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:42.737010 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5qqqk"
Apr 16 20:38:47.277262 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:47.277214 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmvf\" (UniqueName: \"kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf\") pod \"network-check-target-4cr2r\" (UID: \"e8800de1-77ef-480f-81de-cc93318d33b6\") " pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:38:47.277262 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:47.277265 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:38:47.279828 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:47.279808 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 20:38:47.279902 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:47.279879 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 20:38:47.287566 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:47.287538 2537 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 20:38:47.287618 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:47.287607 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs podName:11dcf076-13e5-4128-bf13-7e6c86c6dd5b nodeName:}" failed. No retries permitted until 2026-04-16 20:39:51.287591672 +0000 UTC m=+130.272735241 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs") pod "network-metrics-daemon-d6xkn" (UID: "11dcf076-13e5-4128-bf13-7e6c86c6dd5b") : secret "metrics-daemon-secret" not found
Apr 16 20:38:47.289369 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:47.289346 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 20:38:47.302110 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:47.302091 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvmvf\" (UniqueName: \"kubernetes.io/projected/e8800de1-77ef-480f-81de-cc93318d33b6-kube-api-access-nvmvf\") pod \"network-check-target-4cr2r\" (UID: \"e8800de1-77ef-480f-81de-cc93318d33b6\") " pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:38:47.405114 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:47.405094 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-484zf\""
Apr 16 20:38:47.413066 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:47.413045 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:38:47.539772 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:47.539708 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4cr2r"]
Apr 16 20:38:47.542708 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:38:47.542673 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8800de1_77ef_480f_81de_cc93318d33b6.slice/crio-d5af0ef48914e40e59fa4d6a8583c04ee3ef14cb6212bf73d1c345b46a1cc105 WatchSource:0}: Error finding container d5af0ef48914e40e59fa4d6a8583c04ee3ef14cb6212bf73d1c345b46a1cc105: Status 404 returned error can't find the container with id d5af0ef48914e40e59fa4d6a8583c04ee3ef14cb6212bf73d1c345b46a1cc105
Apr 16 20:38:47.803714 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:47.803656 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4cr2r" event={"ID":"e8800de1-77ef-480f-81de-cc93318d33b6","Type":"ContainerStarted","Data":"d5af0ef48914e40e59fa4d6a8583c04ee3ef14cb6212bf73d1c345b46a1cc105"}
Apr 16 20:38:49.696045 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:49.696007 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"
Apr 16 20:38:49.696045 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:49.696050 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:38:49.696418 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:49.696136 2537 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:38:49.696418 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:49.696139 2537 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:38:49.696418 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:49.696178 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:38:49.696418 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:49.696190 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls podName:8ff520c7-8bbe-42a7-8fc6-dced59fa3098 nodeName:}" failed. No retries permitted until 2026-04-16 20:39:21.696177521 +0000 UTC m=+100.681321090 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls") pod "dns-default-fw7ch" (UID: "8ff520c7-8bbe-42a7-8fc6-dced59fa3098") : secret "dns-default-metrics-tls" not found
Apr 16 20:38:49.696418 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:49.696218 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert podName:963c2c1a-13eb-461e-bac5-d7a50b6c68ca nodeName:}" failed. No retries permitted until 2026-04-16 20:39:21.696205364 +0000 UTC m=+100.681348937 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2hb7" (UID: "963c2c1a-13eb-461e-bac5-d7a50b6c68ca") : secret "networking-console-plugin-cert" not found
Apr 16 20:38:49.696418 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:49.696229 2537 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:38:49.696418 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:49.696237 2537 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-657b97c9cd-g52rs: secret "image-registry-tls" not found
Apr 16 20:38:49.696418 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:49.696248 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz"
Apr 16 20:38:49.696418 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:49.696263 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls podName:f300a967-50a7-4aca-b7dc-57b123274d17 nodeName:}" failed. No retries permitted until 2026-04-16 20:39:21.696253657 +0000 UTC m=+100.681397226 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls") pod "image-registry-657b97c9cd-g52rs" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17") : secret "image-registry-tls" not found
Apr 16 20:38:49.696418 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:49.696302 2537 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:38:49.696418 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:38:49.696349 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert podName:e45e2b17-af71-470b-a92b-013389ef5f6c nodeName:}" failed. No retries permitted until 2026-04-16 20:39:21.696340233 +0000 UTC m=+100.681483802 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert") pod "ingress-canary-dxvbz" (UID: "e45e2b17-af71-470b-a92b-013389ef5f6c") : secret "canary-serving-cert" not found
Apr 16 20:38:51.811460 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:51.811424 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4cr2r" event={"ID":"e8800de1-77ef-480f-81de-cc93318d33b6","Type":"ContainerStarted","Data":"c89fbf2b2de3f0f980faac98640d9affe7e20e185a5d3a0871b818bcddae30a2"}
Apr 16 20:38:51.811787 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:51.811582 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:38:51.826057 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:38:51.826012 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4cr2r" podStartSLOduration=67.239121989 podStartE2EDuration="1m10.826001368s" podCreationTimestamp="2026-04-16 20:37:41 +0000 UTC" firstStartedPulling="2026-04-16 20:38:47.544522713 +0000 UTC m=+66.529666286" lastFinishedPulling="2026-04-16 20:38:51.131402085 +0000 UTC m=+70.116545665" observedRunningTime="2026-04-16 20:38:51.825929421 +0000 UTC m=+70.811073025" watchObservedRunningTime="2026-04-16 20:38:51.826001368 +0000 UTC m=+70.811144957"
Apr 16 20:39:21.703333 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:21.703300 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:39:21.703333 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:21.703335 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz"
Apr 16 20:39:21.703929 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:21.703398 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"
Apr 16 20:39:21.703929 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:21.703427 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:39:21.703929 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:39:21.703439 2537 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:39:21.703929 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:39:21.703456 2537 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-657b97c9cd-g52rs: secret "image-registry-tls" not found
Apr 16 20:39:21.703929 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:39:21.703506 2537 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:39:21.703929 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:39:21.703508 2537 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:39:21.703929 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:39:21.703521 2537 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:39:21.703929 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:39:21.703514 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls podName:f300a967-50a7-4aca-b7dc-57b123274d17 nodeName:}" failed. No retries permitted until 2026-04-16 20:40:25.703496383 +0000 UTC m=+164.688639967 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls") pod "image-registry-657b97c9cd-g52rs" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17") : secret "image-registry-tls" not found
Apr 16 20:39:21.703929 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:39:21.703613 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls podName:8ff520c7-8bbe-42a7-8fc6-dced59fa3098 nodeName:}" failed. No retries permitted until 2026-04-16 20:40:25.703582884 +0000 UTC m=+164.688726469 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls") pod "dns-default-fw7ch" (UID: "8ff520c7-8bbe-42a7-8fc6-dced59fa3098") : secret "dns-default-metrics-tls" not found
Apr 16 20:39:21.703929 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:39:21.703628 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert podName:e45e2b17-af71-470b-a92b-013389ef5f6c nodeName:}" failed. No retries permitted until 2026-04-16 20:40:25.703621777 +0000 UTC m=+164.688765346 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert") pod "ingress-canary-dxvbz" (UID: "e45e2b17-af71-470b-a92b-013389ef5f6c") : secret "canary-serving-cert" not found
Apr 16 20:39:21.703929 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:39:21.703643 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert podName:963c2c1a-13eb-461e-bac5-d7a50b6c68ca nodeName:}" failed. No retries permitted until 2026-04-16 20:40:25.70363401 +0000 UTC m=+164.688777581 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2hb7" (UID: "963c2c1a-13eb-461e-bac5-d7a50b6c68ca") : secret "networking-console-plugin-cert" not found
Apr 16 20:39:22.816318 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:22.816282 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4cr2r"
Apr 16 20:39:51.297867 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:51.297827 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:39:51.298252 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:39:51.297943 2537 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 20:39:51.298252 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:39:51.298004 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs podName:11dcf076-13e5-4128-bf13-7e6c86c6dd5b nodeName:}" failed. No retries permitted until 2026-04-16 20:41:53.297991187 +0000 UTC m=+252.283134756 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs") pod "network-metrics-daemon-d6xkn" (UID: "11dcf076-13e5-4128-bf13-7e6c86c6dd5b") : secret "metrics-daemon-secret" not found
Apr 16 20:39:56.964488 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:56.964452 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8d7wj"]
Apr 16 20:39:56.967386 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:56.967364 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8d7wj"
Apr 16 20:39:56.969631 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:56.969609 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 20:39:56.970651 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:56.970636 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:39:56.970701 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:56.970651 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mhjp9\""
Apr 16 20:39:56.974983 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:56.974964 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8d7wj"]
Apr 16 20:39:57.140233 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:57.140199 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndz66\" (UniqueName: \"kubernetes.io/projected/7853bade-dca6-4e57-adbd-3fc48629e3ed-kube-api-access-ndz66\") pod
\"volume-data-source-validator-7c6cbb6c87-8d7wj\" (UID: \"7853bade-dca6-4e57-adbd-3fc48629e3ed\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8d7wj" Apr 16 20:39:57.240745 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:57.240691 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndz66\" (UniqueName: \"kubernetes.io/projected/7853bade-dca6-4e57-adbd-3fc48629e3ed-kube-api-access-ndz66\") pod \"volume-data-source-validator-7c6cbb6c87-8d7wj\" (UID: \"7853bade-dca6-4e57-adbd-3fc48629e3ed\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8d7wj" Apr 16 20:39:57.248516 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:57.248487 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndz66\" (UniqueName: \"kubernetes.io/projected/7853bade-dca6-4e57-adbd-3fc48629e3ed-kube-api-access-ndz66\") pod \"volume-data-source-validator-7c6cbb6c87-8d7wj\" (UID: \"7853bade-dca6-4e57-adbd-3fc48629e3ed\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8d7wj" Apr 16 20:39:57.276531 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:57.276509 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8d7wj" Apr 16 20:39:57.379881 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:57.379848 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8d7wj"] Apr 16 20:39:57.383724 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:39:57.383699 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7853bade_dca6_4e57_adbd_3fc48629e3ed.slice/crio-4a8f05baaea51de0276742515cea6d6bf4e74503b15bf3f168f9f713b3c14def WatchSource:0}: Error finding container 4a8f05baaea51de0276742515cea6d6bf4e74503b15bf3f168f9f713b3c14def: Status 404 returned error can't find the container with id 4a8f05baaea51de0276742515cea6d6bf4e74503b15bf3f168f9f713b3c14def Apr 16 20:39:57.929387 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:57.929351 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8d7wj" event={"ID":"7853bade-dca6-4e57-adbd-3fc48629e3ed","Type":"ContainerStarted","Data":"4a8f05baaea51de0276742515cea6d6bf4e74503b15bf3f168f9f713b3c14def"} Apr 16 20:39:58.932972 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:58.932932 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8d7wj" event={"ID":"7853bade-dca6-4e57-adbd-3fc48629e3ed","Type":"ContainerStarted","Data":"cd93f82cfe52af5c6e6e9339d8f2c4f20f9127a59addf845d5fb1d14b0893d24"} Apr 16 20:39:58.947938 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:39:58.947842 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8d7wj" podStartSLOduration=1.645385018 podStartE2EDuration="2.947828023s" podCreationTimestamp="2026-04-16 20:39:56 
+0000 UTC" firstStartedPulling="2026-04-16 20:39:57.3852845 +0000 UTC m=+136.370428072" lastFinishedPulling="2026-04-16 20:39:58.687727508 +0000 UTC m=+137.672871077" observedRunningTime="2026-04-16 20:39:58.946982898 +0000 UTC m=+137.932126488" watchObservedRunningTime="2026-04-16 20:39:58.947828023 +0000 UTC m=+137.932971614" Apr 16 20:40:01.751573 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.751536 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg"] Apr 16 20:40:01.754519 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.754503 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" Apr 16 20:40:01.756674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.756645 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 20:40:01.756674 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.756660 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:40:01.757580 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.757549 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ml7j5\"" Apr 16 20:40:01.757713 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.757586 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 20:40:01.757713 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.757571 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 20:40:01.772818 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.772795 2537 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg"] Apr 16 20:40:01.873410 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.873364 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1be75c1-32f2-40ea-9bb4-c921c216c25d-config\") pod \"service-ca-operator-d6fc45fc5-9k6sg\" (UID: \"f1be75c1-32f2-40ea-9bb4-c921c216c25d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" Apr 16 20:40:01.873410 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.873418 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1be75c1-32f2-40ea-9bb4-c921c216c25d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9k6sg\" (UID: \"f1be75c1-32f2-40ea-9bb4-c921c216c25d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" Apr 16 20:40:01.873637 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.873536 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6krq\" (UniqueName: \"kubernetes.io/projected/f1be75c1-32f2-40ea-9bb4-c921c216c25d-kube-api-access-c6krq\") pod \"service-ca-operator-d6fc45fc5-9k6sg\" (UID: \"f1be75c1-32f2-40ea-9bb4-c921c216c25d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" Apr 16 20:40:01.974486 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.974456 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6krq\" (UniqueName: \"kubernetes.io/projected/f1be75c1-32f2-40ea-9bb4-c921c216c25d-kube-api-access-c6krq\") pod \"service-ca-operator-d6fc45fc5-9k6sg\" (UID: \"f1be75c1-32f2-40ea-9bb4-c921c216c25d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" Apr 16 20:40:01.974638 ip-10-0-129-199 kubenswrapper[2537]: 
I0416 20:40:01.974530 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1be75c1-32f2-40ea-9bb4-c921c216c25d-config\") pod \"service-ca-operator-d6fc45fc5-9k6sg\" (UID: \"f1be75c1-32f2-40ea-9bb4-c921c216c25d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" Apr 16 20:40:01.974638 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.974549 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1be75c1-32f2-40ea-9bb4-c921c216c25d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9k6sg\" (UID: \"f1be75c1-32f2-40ea-9bb4-c921c216c25d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" Apr 16 20:40:01.975134 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.975114 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1be75c1-32f2-40ea-9bb4-c921c216c25d-config\") pod \"service-ca-operator-d6fc45fc5-9k6sg\" (UID: \"f1be75c1-32f2-40ea-9bb4-c921c216c25d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" Apr 16 20:40:01.976739 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.976720 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1be75c1-32f2-40ea-9bb4-c921c216c25d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9k6sg\" (UID: \"f1be75c1-32f2-40ea-9bb4-c921c216c25d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" Apr 16 20:40:01.982179 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:01.982157 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6krq\" (UniqueName: \"kubernetes.io/projected/f1be75c1-32f2-40ea-9bb4-c921c216c25d-kube-api-access-c6krq\") pod \"service-ca-operator-d6fc45fc5-9k6sg\" (UID: 
\"f1be75c1-32f2-40ea-9bb4-c921c216c25d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" Apr 16 20:40:02.062648 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:02.062554 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" Apr 16 20:40:02.171964 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:02.171935 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg"] Apr 16 20:40:02.175523 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:40:02.175495 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1be75c1_32f2_40ea_9bb4_c921c216c25d.slice/crio-35ddef9d1d954e1d86146bed8cef3f66535f86c7b183c0e5a1fc77511158e49e WatchSource:0}: Error finding container 35ddef9d1d954e1d86146bed8cef3f66535f86c7b183c0e5a1fc77511158e49e: Status 404 returned error can't find the container with id 35ddef9d1d954e1d86146bed8cef3f66535f86c7b183c0e5a1fc77511158e49e Apr 16 20:40:02.942452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:02.942415 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" event={"ID":"f1be75c1-32f2-40ea-9bb4-c921c216c25d","Type":"ContainerStarted","Data":"35ddef9d1d954e1d86146bed8cef3f66535f86c7b183c0e5a1fc77511158e49e"} Apr 16 20:40:03.247767 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:03.247693 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gsrd4_74821fb7-65d5-4396-bc93-add3e5936d13/dns-node-resolver/0.log" Apr 16 20:40:03.847120 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:03.847094 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2t2sk_81e7762a-1005-4a21-8f55-9dda467004e0/node-ca/0.log" Apr 16 20:40:04.947288 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:40:04.947255 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" event={"ID":"f1be75c1-32f2-40ea-9bb4-c921c216c25d","Type":"ContainerStarted","Data":"78a3dc2df31323beba6e37c59739c27a983d478ba1b33f24732a66af658e670a"} Apr 16 20:40:04.961336 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:04.961290 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" podStartSLOduration=1.626646222 podStartE2EDuration="3.961273734s" podCreationTimestamp="2026-04-16 20:40:01 +0000 UTC" firstStartedPulling="2026-04-16 20:40:02.177144008 +0000 UTC m=+141.162287580" lastFinishedPulling="2026-04-16 20:40:04.511771523 +0000 UTC m=+143.496915092" observedRunningTime="2026-04-16 20:40:04.960602912 +0000 UTC m=+143.945746499" watchObservedRunningTime="2026-04-16 20:40:04.961273734 +0000 UTC m=+143.946417325" Apr 16 20:40:20.908623 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:40:20.908590 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7" podUID="963c2c1a-13eb-461e-bac5-d7a50b6c68ca" Apr 16 20:40:20.929240 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:40:20.929213 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" podUID="f300a967-50a7-4aca-b7dc-57b123274d17" Apr 16 20:40:20.944386 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:40:20.944361 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline 
exceeded" pod="openshift-dns/dns-default-fw7ch" podUID="8ff520c7-8bbe-42a7-8fc6-dced59fa3098" Apr 16 20:40:20.968662 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:40:20.968636 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dxvbz" podUID="e45e2b17-af71-470b-a92b-013389ef5f6c" Apr 16 20:40:20.977063 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:20.977045 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7" Apr 16 20:40:20.977192 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:20.977046 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fw7ch" Apr 16 20:40:21.593260 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:40:21.593225 2537 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-d6xkn" podUID="11dcf076-13e5-4128-bf13-7e6c86c6dd5b" Apr 16 20:40:25.740160 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.740123 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7" Apr 16 20:40:25.740160 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.740163 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls\") pod \"dns-default-fw7ch\" 
(UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch" Apr 16 20:40:25.740699 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.740186 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:40:25.740699 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.740213 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz" Apr 16 20:40:25.742666 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.742629 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff520c7-8bbe-42a7-8fc6-dced59fa3098-metrics-tls\") pod \"dns-default-fw7ch\" (UID: \"8ff520c7-8bbe-42a7-8fc6-dced59fa3098\") " pod="openshift-dns/dns-default-fw7ch" Apr 16 20:40:25.742666 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.742635 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/963c2c1a-13eb-461e-bac5-d7a50b6c68ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2hb7\" (UID: \"963c2c1a-13eb-461e-bac5-d7a50b6c68ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7" Apr 16 20:40:25.743037 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.743016 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") pod \"image-registry-657b97c9cd-g52rs\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:40:25.743037 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.743033 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e45e2b17-af71-470b-a92b-013389ef5f6c-cert\") pod \"ingress-canary-dxvbz\" (UID: \"e45e2b17-af71-470b-a92b-013389ef5f6c\") " pod="openshift-ingress-canary/ingress-canary-dxvbz" Apr 16 20:40:25.780407 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.780384 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-pgwz8\"" Apr 16 20:40:25.781230 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.781214 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wwh4f\"" Apr 16 20:40:25.788094 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.788080 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7" Apr 16 20:40:25.788192 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.788178 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fw7ch" Apr 16 20:40:25.913919 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.913873 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fw7ch"] Apr 16 20:40:25.917382 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:40:25.917361 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ff520c7_8bbe_42a7_8fc6_dced59fa3098.slice/crio-564bd4811d0d56d875375b292e3189129abf77e9123ee97c0c1e8ab51305cbdc WatchSource:0}: Error finding container 564bd4811d0d56d875375b292e3189129abf77e9123ee97c0c1e8ab51305cbdc: Status 404 returned error can't find the container with id 564bd4811d0d56d875375b292e3189129abf77e9123ee97c0c1e8ab51305cbdc Apr 16 20:40:25.924748 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.924726 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7"] Apr 16 20:40:25.927314 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:40:25.927290 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod963c2c1a_13eb_461e_bac5_d7a50b6c68ca.slice/crio-f899cea412d34226bc165bd75a855e4560a88951dfc171301e6847a522e25e06 WatchSource:0}: Error finding container f899cea412d34226bc165bd75a855e4560a88951dfc171301e6847a522e25e06: Status 404 returned error can't find the container with id f899cea412d34226bc165bd75a855e4560a88951dfc171301e6847a522e25e06 Apr 16 20:40:25.986949 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:25.986923 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7" event={"ID":"963c2c1a-13eb-461e-bac5-d7a50b6c68ca","Type":"ContainerStarted","Data":"f899cea412d34226bc165bd75a855e4560a88951dfc171301e6847a522e25e06"} Apr 16 20:40:25.987791 ip-10-0-129-199 kubenswrapper[2537]: I0416 
20:40:25.987773 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fw7ch" event={"ID":"8ff520c7-8bbe-42a7-8fc6-dced59fa3098","Type":"ContainerStarted","Data":"564bd4811d0d56d875375b292e3189129abf77e9123ee97c0c1e8ab51305cbdc"} Apr 16 20:40:27.994763 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:27.994728 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fw7ch" event={"ID":"8ff520c7-8bbe-42a7-8fc6-dced59fa3098","Type":"ContainerStarted","Data":"17491f9f57c2ff79786c8bfeaefc74e016fc4a5e9e9a53870e9ccd4bc5323d69"} Apr 16 20:40:27.995330 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:27.995303 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fw7ch" event={"ID":"8ff520c7-8bbe-42a7-8fc6-dced59fa3098","Type":"ContainerStarted","Data":"1e45ca82d30c81f0d7df65d56f006133e9033c61fb3ed2acea74e46286fb160b"} Apr 16 20:40:27.995536 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:27.995507 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fw7ch" Apr 16 20:40:27.996769 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:27.996743 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7" event={"ID":"963c2c1a-13eb-461e-bac5-d7a50b6c68ca","Type":"ContainerStarted","Data":"69e25fb1c3f8112353df1fd189b147f0b8c6a4f9ae3fd78746e22b3ab3b78f42"} Apr 16 20:40:28.012959 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:28.012924 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fw7ch" podStartSLOduration=129.546644475 podStartE2EDuration="2m11.012912873s" podCreationTimestamp="2026-04-16 20:38:17 +0000 UTC" firstStartedPulling="2026-04-16 20:40:25.918993545 +0000 UTC m=+164.904137117" lastFinishedPulling="2026-04-16 20:40:27.385261945 +0000 UTC m=+166.370405515" observedRunningTime="2026-04-16 
20:40:28.012399596 +0000 UTC m=+166.997543185" watchObservedRunningTime="2026-04-16 20:40:28.012912873 +0000 UTC m=+166.998056457" Apr 16 20:40:28.026281 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:28.026248 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2hb7" podStartSLOduration=161.966160998 podStartE2EDuration="2m43.0262377s" podCreationTimestamp="2026-04-16 20:37:45 +0000 UTC" firstStartedPulling="2026-04-16 20:40:25.929051553 +0000 UTC m=+164.914195126" lastFinishedPulling="2026-04-16 20:40:26.989128255 +0000 UTC m=+165.974271828" observedRunningTime="2026-04-16 20:40:28.025704544 +0000 UTC m=+167.010848134" watchObservedRunningTime="2026-04-16 20:40:28.0262377 +0000 UTC m=+167.011381303" Apr 16 20:40:30.140626 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.140595 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fkfvs"] Apr 16 20:40:30.143398 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.143381 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fkfvs" Apr 16 20:40:30.147396 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.147359 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 20:40:30.147522 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.147436 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 20:40:30.147871 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.147740 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 20:40:30.148420 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.148403 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-khb4d\"" Apr 16 20:40:30.151333 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.151314 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 20:40:30.166574 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.166540 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fkfvs"] Apr 16 20:40:30.169756 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.169734 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs" Apr 16 20:40:30.169845 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.169789 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6nrln\" (UniqueName: \"kubernetes.io/projected/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-kube-api-access-6nrln\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.169845 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.169828 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-crio-socket\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.169983 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.169845 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-data-volume\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.169983 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.169868 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.270657 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.270595 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.270657 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.270631 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.270842 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.270658 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nrln\" (UniqueName: \"kubernetes.io/projected/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-kube-api-access-6nrln\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.270842 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.270790 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-crio-socket\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.270842 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.270822 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-data-volume\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.271002 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.270900 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-crio-socket\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.271105 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.271085 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-data-volume\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.271209 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.271191 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.272794 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.272774 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.291213 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.291187 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nrln\" (UniqueName: \"kubernetes.io/projected/8a7d4825-6f48-4c8e-98c8-973ed6e7a0de-kube-api-access-6nrln\") pod \"insights-runtime-extractor-fkfvs\" (UID: \"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de\") " pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.452292 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.452227 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fkfvs"
Apr 16 20:40:30.607210 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:30.607167 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fkfvs"]
Apr 16 20:40:30.609871 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:40:30.609844 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a7d4825_6f48_4c8e_98c8_973ed6e7a0de.slice/crio-89ba57c0f5d5229a335f5e2b2e18dd3c212769560e86b5787282552d02fc377e WatchSource:0}: Error finding container 89ba57c0f5d5229a335f5e2b2e18dd3c212769560e86b5787282552d02fc377e: Status 404 returned error can't find the container with id 89ba57c0f5d5229a335f5e2b2e18dd3c212769560e86b5787282552d02fc377e
Apr 16 20:40:31.006305 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:31.006276 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fkfvs" event={"ID":"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de","Type":"ContainerStarted","Data":"06ff11dc48c26ee95a7fc50fde7af824e2df5f8f7ecf2454f84fb4ecf5cf2ad1"}
Apr 16 20:40:31.006305 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:31.006307 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fkfvs" event={"ID":"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de","Type":"ContainerStarted","Data":"89ba57c0f5d5229a335f5e2b2e18dd3c212769560e86b5787282552d02fc377e"}
Apr 16 20:40:31.624100 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:31.624068 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr"]
Apr 16 20:40:31.627112 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:31.627091 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr"
Apr 16 20:40:31.629491 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:31.629466 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-c8cck\""
Apr 16 20:40:31.629606 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:31.629507 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 20:40:31.637542 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:31.637524 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr"]
Apr 16 20:40:31.680640 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:31.680553 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/934fbf27-bfcf-48a3-b5ad-351d9c325bab-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bvzlr\" (UID: \"934fbf27-bfcf-48a3-b5ad-351d9c325bab\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr"
Apr 16 20:40:31.781828 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:31.781805 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/934fbf27-bfcf-48a3-b5ad-351d9c325bab-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bvzlr\" (UID: \"934fbf27-bfcf-48a3-b5ad-351d9c325bab\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr"
Apr 16 20:40:31.781929 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:40:31.781917 2537 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 20:40:31.781972 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:40:31.781966 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934fbf27-bfcf-48a3-b5ad-351d9c325bab-tls-certificates podName:934fbf27-bfcf-48a3-b5ad-351d9c325bab nodeName:}" failed. No retries permitted until 2026-04-16 20:40:32.281952976 +0000 UTC m=+171.267096545 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/934fbf27-bfcf-48a3-b5ad-351d9c325bab-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-bvzlr" (UID: "934fbf27-bfcf-48a3-b5ad-351d9c325bab") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 20:40:32.010048 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:32.009967 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fkfvs" event={"ID":"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de","Type":"ContainerStarted","Data":"bfc33f866b8dab400be9c6025dfa4d2c10cbc4b41a373187de5c78737ec7665e"}
Apr 16 20:40:32.285676 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:32.285595 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/934fbf27-bfcf-48a3-b5ad-351d9c325bab-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bvzlr\" (UID: \"934fbf27-bfcf-48a3-b5ad-351d9c325bab\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr"
Apr 16 20:40:32.288147 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:32.288117 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/934fbf27-bfcf-48a3-b5ad-351d9c325bab-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bvzlr\" (UID: \"934fbf27-bfcf-48a3-b5ad-351d9c325bab\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr"
Apr 16 20:40:32.534886 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:32.534855 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr"
Apr 16 20:40:32.583225 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:32.583190 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dxvbz"
Apr 16 20:40:32.585865 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:32.585841 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7vq8v\""
Apr 16 20:40:32.594335 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:32.594313 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dxvbz"
Apr 16 20:40:33.059933 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:33.059911 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dxvbz"]
Apr 16 20:40:33.062878 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:40:33.062858 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode45e2b17_af71_470b_a92b_013389ef5f6c.slice/crio-dd6f4c0b27cce4c22ce859b85c18c35262e9b4cae5e68c9e94dd0bdedb88398c WatchSource:0}: Error finding container dd6f4c0b27cce4c22ce859b85c18c35262e9b4cae5e68c9e94dd0bdedb88398c: Status 404 returned error can't find the container with id dd6f4c0b27cce4c22ce859b85c18c35262e9b4cae5e68c9e94dd0bdedb88398c
Apr 16 20:40:33.079788 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:33.079767 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr"]
Apr 16 20:40:33.082131 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:40:33.082110 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod934fbf27_bfcf_48a3_b5ad_351d9c325bab.slice/crio-66c391774d3de2b3f1e939832001a4782965c0585b3f79154bb11bc5165f27a0 WatchSource:0}: Error finding container 66c391774d3de2b3f1e939832001a4782965c0585b3f79154bb11bc5165f27a0: Status 404 returned error can't find the container with id 66c391774d3de2b3f1e939832001a4782965c0585b3f79154bb11bc5165f27a0
Apr 16 20:40:33.583710 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:33.583682 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:40:34.016548 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:34.016511 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dxvbz" event={"ID":"e45e2b17-af71-470b-a92b-013389ef5f6c","Type":"ContainerStarted","Data":"dd6f4c0b27cce4c22ce859b85c18c35262e9b4cae5e68c9e94dd0bdedb88398c"}
Apr 16 20:40:34.017740 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:34.017699 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr" event={"ID":"934fbf27-bfcf-48a3-b5ad-351d9c325bab","Type":"ContainerStarted","Data":"66c391774d3de2b3f1e939832001a4782965c0585b3f79154bb11bc5165f27a0"}
Apr 16 20:40:34.019685 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:34.019653 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fkfvs" event={"ID":"8a7d4825-6f48-4c8e-98c8-973ed6e7a0de","Type":"ContainerStarted","Data":"89b2ad39f00f4d42077a387ff796a97db6ef7b70d400e039f43f462d0343b98c"}
Apr 16 20:40:34.044116 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:34.044062 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fkfvs" podStartSLOduration=1.725607221 podStartE2EDuration="4.044049998s" podCreationTimestamp="2026-04-16 20:40:30 +0000 UTC" firstStartedPulling="2026-04-16 20:40:30.659064685 +0000 UTC m=+169.644208254" lastFinishedPulling="2026-04-16 20:40:32.977507443 +0000 UTC m=+171.962651031" observedRunningTime="2026-04-16 20:40:34.042781891 +0000 UTC m=+173.027925483" watchObservedRunningTime="2026-04-16 20:40:34.044049998 +0000 UTC m=+173.029193588"
Apr 16 20:40:35.023262 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:35.023171 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dxvbz" event={"ID":"e45e2b17-af71-470b-a92b-013389ef5f6c","Type":"ContainerStarted","Data":"a82a0f83df077a9350dd7f274b7f48e4ee48694a5745fba21101285e9a688bca"}
Apr 16 20:40:35.024395 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:35.024367 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr" event={"ID":"934fbf27-bfcf-48a3-b5ad-351d9c325bab","Type":"ContainerStarted","Data":"87141c0c0240fdc03e2660a8186b5466a213588995993374032c47aabff5ed24"}
Apr 16 20:40:35.041030 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:35.040985 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dxvbz" podStartSLOduration=136.416318259 podStartE2EDuration="2m18.040972342s" podCreationTimestamp="2026-04-16 20:38:17 +0000 UTC" firstStartedPulling="2026-04-16 20:40:33.065406272 +0000 UTC m=+172.050549841" lastFinishedPulling="2026-04-16 20:40:34.690060354 +0000 UTC m=+173.675203924" observedRunningTime="2026-04-16 20:40:35.039571515 +0000 UTC m=+174.024715097" watchObservedRunningTime="2026-04-16 20:40:35.040972342 +0000 UTC m=+174.026115924"
Apr 16 20:40:35.057835 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:35.057799 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr" podStartSLOduration=2.498069878 podStartE2EDuration="4.057785719s" podCreationTimestamp="2026-04-16 20:40:31 +0000 UTC" firstStartedPulling="2026-04-16 20:40:33.08376586 +0000 UTC m=+172.068909432" lastFinishedPulling="2026-04-16 20:40:34.64348169 +0000 UTC m=+173.628625273" observedRunningTime="2026-04-16 20:40:35.057275249 +0000 UTC m=+174.042418827" watchObservedRunningTime="2026-04-16 20:40:35.057785719 +0000 UTC m=+174.042929512"
Apr 16 20:40:35.583800 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:35.583768 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:40:35.586629 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:35.586605 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-66wkg\""
Apr 16 20:40:35.594480 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:35.594455 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:40:35.719000 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:35.718970 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-657b97c9cd-g52rs"]
Apr 16 20:40:35.721549 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:40:35.721519 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf300a967_50a7_4aca_b7dc_57b123274d17.slice/crio-99b686e18735cb2d650c865444204b7a44c72dbe1fef7b2df37a4199e6429765 WatchSource:0}: Error finding container 99b686e18735cb2d650c865444204b7a44c72dbe1fef7b2df37a4199e6429765: Status 404 returned error can't find the container with id 99b686e18735cb2d650c865444204b7a44c72dbe1fef7b2df37a4199e6429765
Apr 16 20:40:36.029428 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.029391 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" event={"ID":"f300a967-50a7-4aca-b7dc-57b123274d17","Type":"ContainerStarted","Data":"e2e17bb31dcd3ae5236daff93dd55754dd96f10bf51753d0375b01e2a614304e"}
Apr 16 20:40:36.029912 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.029436 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" event={"ID":"f300a967-50a7-4aca-b7dc-57b123274d17","Type":"ContainerStarted","Data":"99b686e18735cb2d650c865444204b7a44c72dbe1fef7b2df37a4199e6429765"}
Apr 16 20:40:36.029912 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.029636 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr"
Apr 16 20:40:36.035221 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.035199 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bvzlr"
Apr 16 20:40:36.052602 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.052543 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" podStartSLOduration=174.052531998 podStartE2EDuration="2m54.052531998s" podCreationTimestamp="2026-04-16 20:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:40:36.051370399 +0000 UTC m=+175.036514001" watchObservedRunningTime="2026-04-16 20:40:36.052531998 +0000 UTC m=+175.037675589"
Apr 16 20:40:36.674926 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.674894 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"]
Apr 16 20:40:36.679347 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.679332 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.681655 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.681633 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 20:40:36.681754 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.681734 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 20:40:36.682727 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.682706 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 20:40:36.682727 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.682726 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 20:40:36.682899 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.682773 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 20:40:36.682899 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.682893 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-7nxkx\""
Apr 16 20:40:36.687753 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.687734 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"]
Apr 16 20:40:36.718573 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.718539 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/108c518b-e8d2-4cba-bed4-69db9efe10ac-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-l5gv9\" (UID: \"108c518b-e8d2-4cba-bed4-69db9efe10ac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.718654 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.718592 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/108c518b-e8d2-4cba-bed4-69db9efe10ac-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-l5gv9\" (UID: \"108c518b-e8d2-4cba-bed4-69db9efe10ac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.718716 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.718698 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfswx\" (UniqueName: \"kubernetes.io/projected/108c518b-e8d2-4cba-bed4-69db9efe10ac-kube-api-access-dfswx\") pod \"prometheus-operator-5676c8c784-l5gv9\" (UID: \"108c518b-e8d2-4cba-bed4-69db9efe10ac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.718760 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.718727 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/108c518b-e8d2-4cba-bed4-69db9efe10ac-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-l5gv9\" (UID: \"108c518b-e8d2-4cba-bed4-69db9efe10ac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.819431 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.819407 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfswx\" (UniqueName: \"kubernetes.io/projected/108c518b-e8d2-4cba-bed4-69db9efe10ac-kube-api-access-dfswx\") pod \"prometheus-operator-5676c8c784-l5gv9\" (UID: \"108c518b-e8d2-4cba-bed4-69db9efe10ac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.819545 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.819437 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/108c518b-e8d2-4cba-bed4-69db9efe10ac-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-l5gv9\" (UID: \"108c518b-e8d2-4cba-bed4-69db9efe10ac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.819624 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.819585 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/108c518b-e8d2-4cba-bed4-69db9efe10ac-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-l5gv9\" (UID: \"108c518b-e8d2-4cba-bed4-69db9efe10ac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.819671 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.819622 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/108c518b-e8d2-4cba-bed4-69db9efe10ac-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-l5gv9\" (UID: \"108c518b-e8d2-4cba-bed4-69db9efe10ac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.820208 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.820187 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/108c518b-e8d2-4cba-bed4-69db9efe10ac-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-l5gv9\" (UID: \"108c518b-e8d2-4cba-bed4-69db9efe10ac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.821850 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.821823 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/108c518b-e8d2-4cba-bed4-69db9efe10ac-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-l5gv9\" (UID: \"108c518b-e8d2-4cba-bed4-69db9efe10ac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.821955 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.821829 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/108c518b-e8d2-4cba-bed4-69db9efe10ac-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-l5gv9\" (UID: \"108c518b-e8d2-4cba-bed4-69db9efe10ac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.828216 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.828197 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfswx\" (UniqueName: \"kubernetes.io/projected/108c518b-e8d2-4cba-bed4-69db9efe10ac-kube-api-access-dfswx\") pod \"prometheus-operator-5676c8c784-l5gv9\" (UID: \"108c518b-e8d2-4cba-bed4-69db9efe10ac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:36.988238 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:36.988190 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"
Apr 16 20:40:37.033453 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:37.033418 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs"
Apr 16 20:40:37.098764 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:37.098719 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-l5gv9"]
Apr 16 20:40:37.101079 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:40:37.101056 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod108c518b_e8d2_4cba_bed4_69db9efe10ac.slice/crio-afc5229ad376e815d6db5e8bb0c9da0831f52efd945bfe625f9f5b9d1f769bef WatchSource:0}: Error finding container afc5229ad376e815d6db5e8bb0c9da0831f52efd945bfe625f9f5b9d1f769bef: Status 404 returned error can't find the container with id afc5229ad376e815d6db5e8bb0c9da0831f52efd945bfe625f9f5b9d1f769bef
Apr 16 20:40:38.001915 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:38.001888 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fw7ch"
Apr 16 20:40:38.043585 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:38.043534 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9" event={"ID":"108c518b-e8d2-4cba-bed4-69db9efe10ac","Type":"ContainerStarted","Data":"afc5229ad376e815d6db5e8bb0c9da0831f52efd945bfe625f9f5b9d1f769bef"}
Apr 16 20:40:39.047844 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:39.047802 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9" event={"ID":"108c518b-e8d2-4cba-bed4-69db9efe10ac","Type":"ContainerStarted","Data":"0c552f590ba8c2d42feb2f9284ec12b9929f3012128618db305aa784509f9d37"}
Apr 16 20:40:39.047844 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:39.047847 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9" event={"ID":"108c518b-e8d2-4cba-bed4-69db9efe10ac","Type":"ContainerStarted","Data":"29c68a26db05e254e9b2be2f1c40cd00e4a24420e4d59005dcf8eed0513c2ee1"}
Apr 16 20:40:39.065810 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:39.065764 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-l5gv9" podStartSLOduration=1.944062137 podStartE2EDuration="3.065750233s" podCreationTimestamp="2026-04-16 20:40:36 +0000 UTC" firstStartedPulling="2026-04-16 20:40:37.102767631 +0000 UTC m=+176.087911204" lastFinishedPulling="2026-04-16 20:40:38.224455731 +0000 UTC m=+177.209599300" observedRunningTime="2026-04-16 20:40:39.064212156 +0000 UTC m=+178.049355746" watchObservedRunningTime="2026-04-16 20:40:39.065750233 +0000 UTC m=+178.050893857"
Apr 16 20:40:41.131000 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.130968 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-fwdtc"]
Apr 16 20:40:41.135602 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.135579 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rzk7j"]
Apr 16 20:40:41.135705 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.135665 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fwdtc"
Apr 16 20:40:41.138474 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.138450 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 20:40:41.138615 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.138543 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fn5z7\""
Apr 16 20:40:41.138690 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.138622 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j"
Apr 16 20:40:41.138797 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.138755 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 20:40:41.139037 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.139007 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 20:40:41.140780 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.140753 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 20:40:41.140780 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.140772 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ltzxm\""
Apr 16 20:40:41.141457 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.141435 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 20:40:41.141587 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.141519 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 20:40:41.167919 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.167898 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rzk7j"]
Apr 16 20:40:41.248341 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248322 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j"
Apr 16 20:40:41.248440 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248350 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j"
Apr 16 20:40:41.248440 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248373 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-accelerators-collector-config\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc"
Apr 16 20:40:41.248440 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248402 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-textfile\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc"
Apr 16 20:40:41.248440 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248431 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-metrics-client-ca\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc"
Apr 16 20:40:41.248613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248472 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-tls\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc"
Apr 16 20:40:41.248613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248572 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-wtmp\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc"
Apr 16 20:40:41.248613 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248597 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc"
Apr 16 20:40:41.248739 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248617 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-746mc\" (UniqueName: \"kubernetes.io/projected/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-api-access-746mc\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j"
Apr 16 20:40:41.248739 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248638 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-sys\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc"
Apr 16 20:40:41.248739 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248702 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-root\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc"
Apr 16 20:40:41.248877 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248735 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j"
Apr 16 20:40:41.248877 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248764 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ch8b\" (UniqueName: \"kubernetes.io/projected/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-kube-api-access-2ch8b\")
pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.248877 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248800 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.249015 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.248882 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.349171 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349149 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-sys\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.349243 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349177 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-root\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.349243 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349192 2537 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.349243 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349208 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ch8b\" (UniqueName: \"kubernetes.io/projected/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-kube-api-access-2ch8b\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.349243 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349227 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.349373 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349249 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-root\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.349373 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349252 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-sys\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" 
Apr 16 20:40:41.349373 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:40:41.349323 2537 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 20:40:41.349502 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349374 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.349502 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:40:41.349423 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-tls podName:86469fb3-aefd-4f48-8ab5-d6bbfd40f984 nodeName:}" failed. No retries permitted until 2026-04-16 20:40:41.849404346 +0000 UTC m=+180.834547921 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-rzk7j" (UID: "86469fb3-aefd-4f48-8ab5-d6bbfd40f984") : secret "kube-state-metrics-tls" not found Apr 16 20:40:41.349502 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349440 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.349502 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349470 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.349502 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349501 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-accelerators-collector-config\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.349755 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349522 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-textfile\") pod \"node-exporter-fwdtc\" 
(UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.349755 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349671 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-metrics-client-ca\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.349755 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349711 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-tls\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.349755 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349743 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-wtmp\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.349936 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349769 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.349936 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349795 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-746mc\" (UniqueName: 
\"kubernetes.io/projected/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-api-access-746mc\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.349936 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349838 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-textfile\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.350079 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349958 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.350079 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.349972 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-wtmp\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.350079 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.350030 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 
20:40:41.350228 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.350112 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-accelerators-collector-config\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.350278 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.350239 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-metrics-client-ca\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.350339 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.350321 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.352002 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.351980 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-tls\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.352090 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.351983 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.352090 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.352018 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.378827 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.378801 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ch8b\" (UniqueName: \"kubernetes.io/projected/6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a-kube-api-access-2ch8b\") pod \"node-exporter-fwdtc\" (UID: \"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a\") " pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.390179 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.390132 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-746mc\" (UniqueName: \"kubernetes.io/projected/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-api-access-746mc\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.445346 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.445326 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-fwdtc" Apr 16 20:40:41.853098 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:41.853020 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:41.853245 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:40:41.853195 2537 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 20:40:41.853313 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:40:41.853263 2537 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-tls podName:86469fb3-aefd-4f48-8ab5-d6bbfd40f984 nodeName:}" failed. No retries permitted until 2026-04-16 20:40:42.853246628 +0000 UTC m=+181.838390219 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-rzk7j" (UID: "86469fb3-aefd-4f48-8ab5-d6bbfd40f984") : secret "kube-state-metrics-tls" not found Apr 16 20:40:42.056352 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:42.056313 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fwdtc" event={"ID":"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a","Type":"ContainerStarted","Data":"82b40d93059a12264aa1d14db5aa590bafe4e081273770899eea7746de5458d5"} Apr 16 20:40:42.860609 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:42.860577 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:42.862798 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:42.862770 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/86469fb3-aefd-4f48-8ab5-d6bbfd40f984-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rzk7j\" (UID: \"86469fb3-aefd-4f48-8ab5-d6bbfd40f984\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:42.953536 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:42.953505 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ltzxm\"" Apr 16 20:40:42.960768 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:42.960747 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" Apr 16 20:40:43.061734 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:43.061695 2537 generic.go:358] "Generic (PLEG): container finished" podID="6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a" containerID="d23ddde484ab3a72c7d5cdec6498a34da777eaf6e9943190f71b827751e28ccf" exitCode=0 Apr 16 20:40:43.061924 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:43.061779 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fwdtc" event={"ID":"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a","Type":"ContainerDied","Data":"d23ddde484ab3a72c7d5cdec6498a34da777eaf6e9943190f71b827751e28ccf"} Apr 16 20:40:43.091462 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:43.091439 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rzk7j"] Apr 16 20:40:43.097489 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:40:43.097463 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86469fb3_aefd_4f48_8ab5_d6bbfd40f984.slice/crio-4a0655116884378ef7824674ce38d20cfea4ca3331a1ee533a1e4dc79cdb35d4 WatchSource:0}: Error finding container 4a0655116884378ef7824674ce38d20cfea4ca3331a1ee533a1e4dc79cdb35d4: Status 404 returned error can't find the container with id 4a0655116884378ef7824674ce38d20cfea4ca3331a1ee533a1e4dc79cdb35d4 Apr 16 20:40:44.067308 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:44.067269 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fwdtc" event={"ID":"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a","Type":"ContainerStarted","Data":"a1ff67ff634614b1543a1f3403583a24e31f4d66e05a19016225be666d5b58b1"} Apr 16 20:40:44.067308 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:44.067314 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fwdtc" 
event={"ID":"6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a","Type":"ContainerStarted","Data":"c52dfb9ef36856b4de8f62725f05f27fe60ab7a1dd9de46cf6fea08a3281f94e"} Apr 16 20:40:44.068480 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:44.068447 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" event={"ID":"86469fb3-aefd-4f48-8ab5-d6bbfd40f984","Type":"ContainerStarted","Data":"4a0655116884378ef7824674ce38d20cfea4ca3331a1ee533a1e4dc79cdb35d4"} Apr 16 20:40:44.092739 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:44.092682 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-fwdtc" podStartSLOduration=2.374748428 podStartE2EDuration="3.092668148s" podCreationTimestamp="2026-04-16 20:40:41 +0000 UTC" firstStartedPulling="2026-04-16 20:40:41.456684184 +0000 UTC m=+180.441827763" lastFinishedPulling="2026-04-16 20:40:42.174603898 +0000 UTC m=+181.159747483" observedRunningTime="2026-04-16 20:40:44.091110783 +0000 UTC m=+183.076254395" watchObservedRunningTime="2026-04-16 20:40:44.092668148 +0000 UTC m=+183.077811742" Apr 16 20:40:45.072732 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.072696 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" event={"ID":"86469fb3-aefd-4f48-8ab5-d6bbfd40f984","Type":"ContainerStarted","Data":"076897ef8520694f4ffcac5f125f18b18e46f0d33c6c5472298396c89bf44b9f"} Apr 16 20:40:45.072732 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.072733 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" event={"ID":"86469fb3-aefd-4f48-8ab5-d6bbfd40f984","Type":"ContainerStarted","Data":"13860009c366741cda95f55bbb46d2a7dc11a503203c5885ba0bb35ba06a5c35"} Apr 16 20:40:45.073125 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.072742 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" event={"ID":"86469fb3-aefd-4f48-8ab5-d6bbfd40f984","Type":"ContainerStarted","Data":"aff28a641c5bb7df57d91b35dadef4c381bc75d0671945bc78680300309568de"} Apr 16 20:40:45.092159 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.092113 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-rzk7j" podStartSLOduration=2.890626375 podStartE2EDuration="4.092099853s" podCreationTimestamp="2026-04-16 20:40:41 +0000 UTC" firstStartedPulling="2026-04-16 20:40:43.099270538 +0000 UTC m=+182.084414121" lastFinishedPulling="2026-04-16 20:40:44.300744029 +0000 UTC m=+183.285887599" observedRunningTime="2026-04-16 20:40:45.091064228 +0000 UTC m=+184.076207816" watchObservedRunningTime="2026-04-16 20:40:45.092099853 +0000 UTC m=+184.077243454" Apr 16 20:40:45.239680 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.239655 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"] Apr 16 20:40:45.243160 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.243142 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9" Apr 16 20:40:45.244681 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.244660 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"] Apr 16 20:40:45.246650 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.246632 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 20:40:45.246762 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.246677 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 20:40:45.246762 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.246717 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 20:40:45.246870 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.246771 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 20:40:45.247008 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.246992 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-gshg9\"" Apr 16 20:40:45.248297 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.248279 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-ip3eq7ggirp2\"" Apr 16 20:40:45.248594 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.248576 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 20:40:45.278764 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.278741 2537 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.278857 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.278781 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-tls\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.278857 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.278815 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.278932 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.278878 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.278932 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.278907 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-metrics-client-ca\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.279022 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.278932 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-grpc-tls\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.279022 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.279003 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r6bg\" (UniqueName: \"kubernetes.io/projected/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-kube-api-access-8r6bg\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.279099 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.279033 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.379505 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.379484 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.379635 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.379529 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.379635 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.379552 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-tls\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.379635 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.379593 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.379635 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.379620 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.379780 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.379641 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-metrics-client-ca\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.379780 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.379658 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-grpc-tls\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.379780 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.379681 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8r6bg\" (UniqueName: \"kubernetes.io/projected/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-kube-api-access-8r6bg\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.380395 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.380358 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-metrics-client-ca\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.382225 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.382196 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.382327 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.382239 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-tls\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.382395 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.382376 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.382434 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.382375 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.382544 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.382522 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.382724 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.382708 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-secret-grpc-tls\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.387921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.387902 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r6bg\" (UniqueName: \"kubernetes.io/projected/5af0ee5a-f6b9-4e06-8341-8a0ed59e153c-kube-api-access-8r6bg\") pod \"thanos-querier-5d9c5db45d-svcm9\" (UID: \"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c\") " pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.553022 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.552999 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"
Apr 16 20:40:45.682280 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.682206 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d9c5db45d-svcm9"]
Apr 16 20:40:45.684904 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:40:45.684873 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5af0ee5a_f6b9_4e06_8341_8a0ed59e153c.slice/crio-00b3a68fb952207d0833ed22f6ccd025a1bf22918fdbe72bd9dadc2bfca509a0 WatchSource:0}: Error finding container 00b3a68fb952207d0833ed22f6ccd025a1bf22918fdbe72bd9dadc2bfca509a0: Status 404 returned error can't find the container with id 00b3a68fb952207d0833ed22f6ccd025a1bf22918fdbe72bd9dadc2bfca509a0
Apr 16 20:40:45.878761 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.878735 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch"]
Apr 16 20:40:45.884448 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.884420 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch"
Apr 16 20:40:45.887792 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.887777 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-sr57h\""
Apr 16 20:40:45.888004 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.887988 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 20:40:45.898595 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.898573 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch"]
Apr 16 20:40:45.984395 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:45.984346 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58ddffcf-42f3-4d68-87b2-9bb8f0c60899-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-t7nch\" (UID: \"58ddffcf-42f3-4d68-87b2-9bb8f0c60899\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch"
Apr 16 20:40:46.076441 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:46.076414 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9" event={"ID":"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c","Type":"ContainerStarted","Data":"00b3a68fb952207d0833ed22f6ccd025a1bf22918fdbe72bd9dadc2bfca509a0"}
Apr 16 20:40:46.084913 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:46.084893 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58ddffcf-42f3-4d68-87b2-9bb8f0c60899-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-t7nch\" (UID: \"58ddffcf-42f3-4d68-87b2-9bb8f0c60899\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch"
Apr 16 20:40:46.087206 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:46.087182 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58ddffcf-42f3-4d68-87b2-9bb8f0c60899-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-t7nch\" (UID: \"58ddffcf-42f3-4d68-87b2-9bb8f0c60899\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch"
Apr 16 20:40:46.193450 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:46.193424 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch"
Apr 16 20:40:46.316449 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:46.316424 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch"]
Apr 16 20:40:46.319021 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:40:46.318988 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58ddffcf_42f3_4d68_87b2_9bb8f0c60899.slice/crio-97b583b1eeb45c8baa882e54c6a761eabd187b99d04af55d5aade1cefd1bd6da WatchSource:0}: Error finding container 97b583b1eeb45c8baa882e54c6a761eabd187b99d04af55d5aade1cefd1bd6da: Status 404 returned error can't find the container with id 97b583b1eeb45c8baa882e54c6a761eabd187b99d04af55d5aade1cefd1bd6da
Apr 16 20:40:47.080751 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.080709 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch" event={"ID":"58ddffcf-42f3-4d68-87b2-9bb8f0c60899","Type":"ContainerStarted","Data":"97b583b1eeb45c8baa882e54c6a761eabd187b99d04af55d5aade1cefd1bd6da"}
Apr 16 20:40:47.429443 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.429264 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:40:47.434474 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.434432 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.440670 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.439127 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 20:40:47.440670 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.439663 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 20:40:47.440670 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.439923 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bea04r43g4e0l\""
Apr 16 20:40:47.440670 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.439983 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 20:40:47.440670 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.440107 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 20:40:47.440670 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.440153 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 20:40:47.440670 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.440306 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 20:40:47.440670 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.440339 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-p998x\""
Apr 16 20:40:47.440670 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.440469 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 20:40:47.440670 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.440504 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 20:40:47.445143 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.445122 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 20:40:47.446528 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.446142 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 20:40:47.446695 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.446666 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 20:40:47.447994 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.447973 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 20:40:47.455346 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.455168 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 20:40:47.462827 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.462805 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:40:47.495717 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.495650 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.495717 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.495701 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-web-config\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.495939 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.495789 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.495939 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.495816 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.495939 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.495843 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5deb3abf-e442-44b4-bea6-61efd1b060c0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.495939 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.495866 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.495939 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.495898 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.495939 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.495923 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.496225 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.495955 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.496225 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.495985 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-config\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.496225 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.496004 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.496225 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.496026 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.496225 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.496041 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.496225 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.496063 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74fxh\" (UniqueName: \"kubernetes.io/projected/5deb3abf-e442-44b4-bea6-61efd1b060c0-kube-api-access-74fxh\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.496225 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.496082 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.496225 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.496102 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5deb3abf-e442-44b4-bea6-61efd1b060c0-config-out\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.496225 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.496118 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.496225 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.496144 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.596859 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.596830 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.596859 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.596862 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597101 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.596891 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597101 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.596917 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-config\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597101 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.596944 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597101 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.596972 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597101 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597000 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597101 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597029 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74fxh\" (UniqueName: \"kubernetes.io/projected/5deb3abf-e442-44b4-bea6-61efd1b060c0-kube-api-access-74fxh\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597101 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597052 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597101 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597082 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5deb3abf-e442-44b4-bea6-61efd1b060c0-config-out\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597101 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597102 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597552 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597125 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597552 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597177 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597552 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597212 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-web-config\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597552 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597273 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597552 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597299 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597552 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597320 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5deb3abf-e442-44b4-bea6-61efd1b060c0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597552 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597342 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.597904 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.597644 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.598292 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.598239 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.598292 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.598266 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.599092 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.598783 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.602074 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.600925 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.602074 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.601511 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.602074 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.601512 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.602268 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.602083 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.602331 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.602307 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.604500 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.604393 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.604500 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.604463 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-config\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.605655 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.605588 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5deb3abf-e442-44b4-bea6-61efd1b060c0-config-out\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:40:47.606786 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.606739 2537
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:40:47.606907 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.606805 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:40:47.607007 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.606981 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:40:47.607389 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.607350 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-web-config\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:40:47.608121 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.608085 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5deb3abf-e442-44b4-bea6-61efd1b060c0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:40:47.609656 ip-10-0-129-199 kubenswrapper[2537]: I0416 
20:40:47.609634 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74fxh\" (UniqueName: \"kubernetes.io/projected/5deb3abf-e442-44b4-bea6-61efd1b060c0-kube-api-access-74fxh\") pod \"prometheus-k8s-0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:40:47.749377 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:47.749284 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:40:48.363780 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:48.363754 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:40:48.367616 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:40:48.367554 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5deb3abf_e442_44b4_bea6_61efd1b060c0.slice/crio-a6ecc5d7d1e8dcfd85ceb92467839d186a408ba255bb987d240138b1e8c1d5ca WatchSource:0}: Error finding container a6ecc5d7d1e8dcfd85ceb92467839d186a408ba255bb987d240138b1e8c1d5ca: Status 404 returned error can't find the container with id a6ecc5d7d1e8dcfd85ceb92467839d186a408ba255bb987d240138b1e8c1d5ca Apr 16 20:40:49.089729 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:49.089692 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerStarted","Data":"a6ecc5d7d1e8dcfd85ceb92467839d186a408ba255bb987d240138b1e8c1d5ca"} Apr 16 20:40:49.091346 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:49.091320 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9" event={"ID":"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c","Type":"ContainerStarted","Data":"f3bc5373961b6d9ccc51c8ac34b8c47203ff18e4943c905cb063b3db623fa26f"} Apr 16 20:40:49.091346 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:40:49.091347 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9" event={"ID":"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c","Type":"ContainerStarted","Data":"db3613c48bed4fa5418974b89dbb98a1b16950c0451a888cbc117a2f2f5ef950"} Apr 16 20:40:49.091530 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:49.091356 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9" event={"ID":"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c","Type":"ContainerStarted","Data":"eb0051ebea58719107ec0b93d60dface925ba08b5f3360a6ccaaaf879cdebdf3"} Apr 16 20:40:49.092498 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:49.092475 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch" event={"ID":"58ddffcf-42f3-4d68-87b2-9bb8f0c60899","Type":"ContainerStarted","Data":"25b1224f9e08b572257c2a07691d533da4e6832d13d4e45ba35f8c2becc965a5"} Apr 16 20:40:49.092703 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:49.092686 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch" Apr 16 20:40:49.097823 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:49.097801 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch" Apr 16 20:40:49.108859 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:49.108822 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-t7nch" podStartSLOduration=2.197838727 podStartE2EDuration="4.108811284s" podCreationTimestamp="2026-04-16 20:40:45 +0000 UTC" firstStartedPulling="2026-04-16 20:40:46.320799494 +0000 UTC m=+185.305943067" lastFinishedPulling="2026-04-16 20:40:48.231772054 +0000 UTC m=+187.216915624" observedRunningTime="2026-04-16 20:40:49.108326345 +0000 
UTC m=+188.093469938" watchObservedRunningTime="2026-04-16 20:40:49.108811284 +0000 UTC m=+188.093954874" Apr 16 20:40:50.099583 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:50.099541 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9" event={"ID":"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c","Type":"ContainerStarted","Data":"a906c96e5cc898fb7b0514a1bd14beb91b5d508f88913cc19423ce97e7f85646"} Apr 16 20:40:50.099583 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:50.099585 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9" event={"ID":"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c","Type":"ContainerStarted","Data":"1d3d4691a1cb6f9042afc2ef21c80fa0a19a29df39a1d574d738fb9879a86087"} Apr 16 20:40:50.100134 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:50.099596 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9" event={"ID":"5af0ee5a-f6b9-4e06-8341-8a0ed59e153c","Type":"ContainerStarted","Data":"5f3463561849eb133b1331241c5a3b9579496f11fbc13efc0d577bd5eaba1bde"} Apr 16 20:40:50.100134 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:50.099701 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9" Apr 16 20:40:50.100782 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:50.100760 2537 generic.go:358] "Generic (PLEG): container finished" podID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerID="41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c" exitCode=0 Apr 16 20:40:50.100892 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:50.100846 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerDied","Data":"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c"} Apr 16 20:40:50.126912 
ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:50.126871 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9" podStartSLOduration=1.453186566 podStartE2EDuration="5.126859772s" podCreationTimestamp="2026-04-16 20:40:45 +0000 UTC" firstStartedPulling="2026-04-16 20:40:45.68717308 +0000 UTC m=+184.672316654" lastFinishedPulling="2026-04-16 20:40:49.360846288 +0000 UTC m=+188.345989860" observedRunningTime="2026-04-16 20:40:50.125522827 +0000 UTC m=+189.110666415" watchObservedRunningTime="2026-04-16 20:40:50.126859772 +0000 UTC m=+189.112003617" Apr 16 20:40:52.977016 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:52.976984 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-657b97c9cd-g52rs"] Apr 16 20:40:52.981480 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:52.981331 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:40:53.111900 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:53.111868 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerStarted","Data":"3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da"} Apr 16 20:40:53.111900 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:53.111897 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerStarted","Data":"aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440"} Apr 16 20:40:53.111900 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:53.111908 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerStarted","Data":"7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43"} Apr 16 20:40:53.112152 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:53.111917 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerStarted","Data":"2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69"} Apr 16 20:40:53.112152 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:53.111927 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerStarted","Data":"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b"} Apr 16 20:40:53.112152 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:53.111936 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerStarted","Data":"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676"} Apr 16 20:40:53.148506 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:53.148415 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.893077217 podStartE2EDuration="6.148398478s" podCreationTimestamp="2026-04-16 20:40:47 +0000 UTC" firstStartedPulling="2026-04-16 20:40:48.369454056 +0000 UTC m=+187.354597625" lastFinishedPulling="2026-04-16 20:40:52.624775313 +0000 UTC m=+191.609918886" observedRunningTime="2026-04-16 20:40:53.146957137 +0000 UTC m=+192.132100738" watchObservedRunningTime="2026-04-16 20:40:53.148398478 +0000 UTC m=+192.133542069" Apr 16 20:40:56.110153 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:56.110123 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/thanos-querier-5d9c5db45d-svcm9" Apr 16 20:40:57.750143 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:40:57.750106 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:41:17.995481 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:17.995444 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" podUID="f300a967-50a7-4aca-b7dc-57b123274d17" containerName="registry" containerID="cri-o://e2e17bb31dcd3ae5236daff93dd55754dd96f10bf51753d0375b01e2a614304e" gracePeriod=30 Apr 16 20:41:18.187240 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.187158 2537 generic.go:358] "Generic (PLEG): container finished" podID="f300a967-50a7-4aca-b7dc-57b123274d17" containerID="e2e17bb31dcd3ae5236daff93dd55754dd96f10bf51753d0375b01e2a614304e" exitCode=0 Apr 16 20:41:18.187240 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.187199 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" event={"ID":"f300a967-50a7-4aca-b7dc-57b123274d17","Type":"ContainerDied","Data":"e2e17bb31dcd3ae5236daff93dd55754dd96f10bf51753d0375b01e2a614304e"} Apr 16 20:41:18.249331 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.249272 2537 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:41:18.349350 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.349313 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s98nw\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-kube-api-access-s98nw\") pod \"f300a967-50a7-4aca-b7dc-57b123274d17\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " Apr 16 20:41:18.350008 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.349985 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") pod \"f300a967-50a7-4aca-b7dc-57b123274d17\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " Apr 16 20:41:18.350167 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.350154 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-bound-sa-token\") pod \"f300a967-50a7-4aca-b7dc-57b123274d17\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " Apr 16 20:41:18.350271 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.350258 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f300a967-50a7-4aca-b7dc-57b123274d17-image-registry-private-configuration\") pod \"f300a967-50a7-4aca-b7dc-57b123274d17\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " Apr 16 20:41:18.350379 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.350367 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f300a967-50a7-4aca-b7dc-57b123274d17-installation-pull-secrets\") pod \"f300a967-50a7-4aca-b7dc-57b123274d17\" (UID: 
\"f300a967-50a7-4aca-b7dc-57b123274d17\") " Apr 16 20:41:18.350472 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.350461 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f300a967-50a7-4aca-b7dc-57b123274d17-trusted-ca\") pod \"f300a967-50a7-4aca-b7dc-57b123274d17\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " Apr 16 20:41:18.350584 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.350549 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f300a967-50a7-4aca-b7dc-57b123274d17-registry-certificates\") pod \"f300a967-50a7-4aca-b7dc-57b123274d17\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " Apr 16 20:41:18.350699 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.350686 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f300a967-50a7-4aca-b7dc-57b123274d17-ca-trust-extracted\") pod \"f300a967-50a7-4aca-b7dc-57b123274d17\" (UID: \"f300a967-50a7-4aca-b7dc-57b123274d17\") " Apr 16 20:41:18.351269 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.351235 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f300a967-50a7-4aca-b7dc-57b123274d17-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f300a967-50a7-4aca-b7dc-57b123274d17" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:41:18.351472 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.351451 2537 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f300a967-50a7-4aca-b7dc-57b123274d17-trusted-ca\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:41:18.351825 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.351794 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f300a967-50a7-4aca-b7dc-57b123274d17-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f300a967-50a7-4aca-b7dc-57b123274d17" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:41:18.353574 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.353531 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f300a967-50a7-4aca-b7dc-57b123274d17-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f300a967-50a7-4aca-b7dc-57b123274d17" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:41:18.353684 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.353584 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f300a967-50a7-4aca-b7dc-57b123274d17" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:41:18.353742 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.353631 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f300a967-50a7-4aca-b7dc-57b123274d17" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:41:18.353742 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.353684 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f300a967-50a7-4aca-b7dc-57b123274d17-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f300a967-50a7-4aca-b7dc-57b123274d17" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:41:18.353942 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.353920 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-kube-api-access-s98nw" (OuterVolumeSpecName: "kube-api-access-s98nw") pod "f300a967-50a7-4aca-b7dc-57b123274d17" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17"). InnerVolumeSpecName "kube-api-access-s98nw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:41:18.363339 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.363306 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f300a967-50a7-4aca-b7dc-57b123274d17-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f300a967-50a7-4aca-b7dc-57b123274d17" (UID: "f300a967-50a7-4aca-b7dc-57b123274d17"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:41:18.452817 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.452782 2537 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f300a967-50a7-4aca-b7dc-57b123274d17-registry-certificates\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:41:18.452817 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.452812 2537 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f300a967-50a7-4aca-b7dc-57b123274d17-ca-trust-extracted\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:41:18.453012 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.452822 2537 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s98nw\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-kube-api-access-s98nw\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:41:18.453012 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.452839 2537 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-registry-tls\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:41:18.453012 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.452849 2537 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f300a967-50a7-4aca-b7dc-57b123274d17-bound-sa-token\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:41:18.453012 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.452857 2537 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f300a967-50a7-4aca-b7dc-57b123274d17-image-registry-private-configuration\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath 
\"\"" Apr 16 20:41:18.453012 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:18.452867 2537 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f300a967-50a7-4aca-b7dc-57b123274d17-installation-pull-secrets\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:41:19.191307 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:19.191273 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" event={"ID":"f300a967-50a7-4aca-b7dc-57b123274d17","Type":"ContainerDied","Data":"99b686e18735cb2d650c865444204b7a44c72dbe1fef7b2df37a4199e6429765"} Apr 16 20:41:19.191307 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:19.191313 2537 scope.go:117] "RemoveContainer" containerID="e2e17bb31dcd3ae5236daff93dd55754dd96f10bf51753d0375b01e2a614304e" Apr 16 20:41:19.191801 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:19.191282 2537 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-657b97c9cd-g52rs" Apr 16 20:41:19.211570 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:19.211534 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-657b97c9cd-g52rs"] Apr 16 20:41:19.221697 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:19.221679 2537 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-657b97c9cd-g52rs"] Apr 16 20:41:19.587307 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:19.587247 2537 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f300a967-50a7-4aca-b7dc-57b123274d17" path="/var/lib/kubelet/pods/f300a967-50a7-4aca-b7dc-57b123274d17/volumes" Apr 16 20:41:20.195726 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:20.195694 2537 generic.go:358] "Generic (PLEG): container finished" podID="f1be75c1-32f2-40ea-9bb4-c921c216c25d" containerID="78a3dc2df31323beba6e37c59739c27a983d478ba1b33f24732a66af658e670a" exitCode=0 Apr 16 20:41:20.196052 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:20.195768 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" event={"ID":"f1be75c1-32f2-40ea-9bb4-c921c216c25d","Type":"ContainerDied","Data":"78a3dc2df31323beba6e37c59739c27a983d478ba1b33f24732a66af658e670a"} Apr 16 20:41:20.196117 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:20.196052 2537 scope.go:117] "RemoveContainer" containerID="78a3dc2df31323beba6e37c59739c27a983d478ba1b33f24732a66af658e670a" Apr 16 20:41:21.201081 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:21.201048 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9k6sg" event={"ID":"f1be75c1-32f2-40ea-9bb4-c921c216c25d","Type":"ContainerStarted","Data":"7db78688b40eed415a45aafc293224f81c7e8d92c10b648e34ef0e2f075848a7"} Apr 16 20:41:47.750353 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:41:47.750318 2537 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:41:47.769517 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:47.769493 2537 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:41:48.297477 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:48.297451 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:41:53.302193 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:53.302156 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:41:53.304298 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:53.304273 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11dcf076-13e5-4128-bf13-7e6c86c6dd5b-metrics-certs\") pod \"network-metrics-daemon-d6xkn\" (UID: \"11dcf076-13e5-4128-bf13-7e6c86c6dd5b\") " pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:41:53.386753 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:53.386730 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vdj2n\""
Apr 16 20:41:53.394547 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:53.394532 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d6xkn"
Apr 16 20:41:53.503585 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:53.503546 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d6xkn"]
Apr 16 20:41:53.506203 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:41:53.506176 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11dcf076_13e5_4128_bf13_7e6c86c6dd5b.slice/crio-dc9c909e63277bd8b802e2b97d6351a59e390aa81c227760acb11b3d39f3dc84 WatchSource:0}: Error finding container dc9c909e63277bd8b802e2b97d6351a59e390aa81c227760acb11b3d39f3dc84: Status 404 returned error can't find the container with id dc9c909e63277bd8b802e2b97d6351a59e390aa81c227760acb11b3d39f3dc84
Apr 16 20:41:54.302808 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:54.302768 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d6xkn" event={"ID":"11dcf076-13e5-4128-bf13-7e6c86c6dd5b","Type":"ContainerStarted","Data":"dc9c909e63277bd8b802e2b97d6351a59e390aa81c227760acb11b3d39f3dc84"}
Apr 16 20:41:55.306975 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:55.306933 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d6xkn" event={"ID":"11dcf076-13e5-4128-bf13-7e6c86c6dd5b","Type":"ContainerStarted","Data":"5cf071ccaa871e6bb2d4e1fd92835a52b9c291ec72b3a192e4572982c748916c"}
Apr 16 20:41:55.306975 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:55.306979 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d6xkn" event={"ID":"11dcf076-13e5-4128-bf13-7e6c86c6dd5b","Type":"ContainerStarted","Data":"81048537bdc4da25687c22f901bdb84932cd70ab034ee3ca42cf6e50edb0e29d"}
Apr 16 20:41:55.323722 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:41:55.323676 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-d6xkn" podStartSLOduration=253.120359758 podStartE2EDuration="4m14.323661573s" podCreationTimestamp="2026-04-16 20:37:41 +0000 UTC" firstStartedPulling="2026-04-16 20:41:53.507964292 +0000 UTC m=+252.493107865" lastFinishedPulling="2026-04-16 20:41:54.71126611 +0000 UTC m=+253.696409680" observedRunningTime="2026-04-16 20:41:55.32164573 +0000 UTC m=+254.306789320" watchObservedRunningTime="2026-04-16 20:41:55.323661573 +0000 UTC m=+254.308805163"
Apr 16 20:42:05.718794 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.718704 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:42:05.719255 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.719126 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="prometheus" containerID="cri-o://db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676" gracePeriod=600
Apr 16 20:42:05.719255 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.719184 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="kube-rbac-proxy-web" containerID="cri-o://7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43" gracePeriod=600
Apr 16 20:42:05.719255 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.719238 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="kube-rbac-proxy-thanos" containerID="cri-o://3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da" gracePeriod=600
Apr 16 20:42:05.719409 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.719192 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="config-reloader" containerID="cri-o://58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b" gracePeriod=600
Apr 16 20:42:05.719409 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.719161 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="kube-rbac-proxy" containerID="cri-o://aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440" gracePeriod=600
Apr 16 20:42:05.719409 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.719299 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="thanos-sidecar" containerID="cri-o://2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69" gracePeriod=600
Apr 16 20:42:05.958295 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.958271 2537 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:05.997333 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997235 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-trusted-ca-bundle\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.997333 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997277 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-tls\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.997333 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997314 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-serving-certs-ca-bundle\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.997678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997373 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-thanos-prometheus-http-client-file\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.997678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997399 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-kube-rbac-proxy\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.997678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997422 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-config\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.997678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997455 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-kubelet-serving-ca-bundle\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.997678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997485 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.997678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997518 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.997678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997572 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74fxh\" (UniqueName: \"kubernetes.io/projected/5deb3abf-e442-44b4-bea6-61efd1b060c0-kube-api-access-74fxh\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.997678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997614 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-k8s-db\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.997678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997649 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-k8s-rulefiles-0\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.997678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997676 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5deb3abf-e442-44b4-bea6-61efd1b060c0-config-out\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.998141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997702 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-metrics-client-certs\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.998141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997703 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:42:05.998141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997730 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-grpc-tls\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.998141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997766 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-metrics-client-ca\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.998141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997797 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-web-config\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.998141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.997814 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5deb3abf-e442-44b4-bea6-61efd1b060c0-tls-assets\") pod \"5deb3abf-e442-44b4-bea6-61efd1b060c0\" (UID: \"5deb3abf-e442-44b4-bea6-61efd1b060c0\") "
Apr 16 20:42:05.998141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.998037 2537 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:05.998478 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.998331 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:42:06.001028 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:05.999632 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:42:06.001028 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.000302 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:42:06.001028 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.000719 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:42:06.001799 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.001767 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:42:06.002236 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.002214 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5deb3abf-e442-44b4-bea6-61efd1b060c0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:42:06.002874 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.002847 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5deb3abf-e442-44b4-bea6-61efd1b060c0-config-out" (OuterVolumeSpecName: "config-out") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:42:06.003270 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.003240 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:06.004038 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.003993 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:06.004176 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.004073 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:06.004176 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.004116 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:06.004176 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.004142 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-config" (OuterVolumeSpecName: "config") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:06.004370 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.004241 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:06.004370 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.004324 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5deb3abf-e442-44b4-bea6-61efd1b060c0-kube-api-access-74fxh" (OuterVolumeSpecName: "kube-api-access-74fxh") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "kube-api-access-74fxh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:42:06.006787 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.005790 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:06.007023 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.006987 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:06.015602 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.015577 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-web-config" (OuterVolumeSpecName: "web-config") pod "5deb3abf-e442-44b4-bea6-61efd1b060c0" (UID: "5deb3abf-e442-44b4-bea6-61efd1b060c0"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:06.098685 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098651 2537 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-74fxh\" (UniqueName: \"kubernetes.io/projected/5deb3abf-e442-44b4-bea6-61efd1b060c0-kube-api-access-74fxh\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098685 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098682 2537 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-k8s-db\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098695 2537 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098708 2537 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5deb3abf-e442-44b4-bea6-61efd1b060c0-config-out\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098720 2537 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-metrics-client-certs\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098731 2537 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-grpc-tls\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098744 2537 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-metrics-client-ca\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098756 2537 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-web-config\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098767 2537 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5deb3abf-e442-44b4-bea6-61efd1b060c0-tls-assets\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098778 2537 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098792 2537 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098804 2537 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098816 2537 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-kube-rbac-proxy\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098828 2537 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-config\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098842 2537 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5deb3abf-e442-44b4-bea6-61efd1b060c0-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098855 2537 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.098890 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.098871 2537 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5deb3abf-e442-44b4-bea6-61efd1b060c0-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:42:06.347492 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347408 2537 generic.go:358] "Generic (PLEG): container finished" podID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerID="3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da" exitCode=0
Apr 16 20:42:06.347492 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347433 2537 generic.go:358] "Generic (PLEG): container finished" podID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerID="aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440" exitCode=0
Apr 16 20:42:06.347492 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347439 2537 generic.go:358] "Generic (PLEG): container finished" podID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerID="7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43" exitCode=0
Apr 16 20:42:06.347492 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347444 2537 generic.go:358] "Generic (PLEG): container finished" podID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerID="2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69" exitCode=0
Apr 16 20:42:06.347492 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347449 2537 generic.go:358] "Generic (PLEG): container finished" podID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerID="58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b" exitCode=0
Apr 16 20:42:06.347492 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347456 2537 generic.go:358] "Generic (PLEG): container finished" podID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerID="db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676" exitCode=0
Apr 16 20:42:06.347492 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347475 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerDied","Data":"3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da"}
Apr 16 20:42:06.347889 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347501 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerDied","Data":"aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440"}
Apr 16 20:42:06.347889 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347511 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerDied","Data":"7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43"}
Apr 16 20:42:06.347889 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347512 2537 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:06.347889 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347521 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerDied","Data":"2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69"}
Apr 16 20:42:06.347889 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347532 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerDied","Data":"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b"}
Apr 16 20:42:06.347889 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347541 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerDied","Data":"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676"}
Apr 16 20:42:06.347889 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347550 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5deb3abf-e442-44b4-bea6-61efd1b060c0","Type":"ContainerDied","Data":"a6ecc5d7d1e8dcfd85ceb92467839d186a408ba255bb987d240138b1e8c1d5ca"}
Apr 16 20:42:06.347889 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.347588 2537 scope.go:117] "RemoveContainer" containerID="3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da"
Apr 16 20:42:06.355701 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.355680 2537 scope.go:117] "RemoveContainer" containerID="aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440"
Apr 16 20:42:06.362199 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.362179 2537 scope.go:117] "RemoveContainer" containerID="7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43"
Apr 16 20:42:06.368265 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.368230 2537 scope.go:117] "RemoveContainer" containerID="2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69"
Apr 16 20:42:06.369619 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.369479 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:42:06.373795 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.373776 2537 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:42:06.375189 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.375171 2537 scope.go:117] "RemoveContainer" containerID="58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b"
Apr 16 20:42:06.381252 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.381234 2537 scope.go:117] "RemoveContainer" containerID="db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676"
Apr 16 20:42:06.387483 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.387463 2537 scope.go:117] "RemoveContainer" containerID="41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c"
Apr 16 20:42:06.393098 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.393082 2537 scope.go:117] "RemoveContainer" containerID="3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da"
Apr 16 20:42:06.393335 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:42:06.393316 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da\": container with ID starting with 3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da not found: ID does not exist" containerID="3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da"
Apr 16 20:42:06.393378 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.393341 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da"} err="failed to get container status \"3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da\": rpc error: code = NotFound desc = could not find container \"3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da\": container with ID starting with 3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da not found: ID does not exist"
Apr 16 20:42:06.393378 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.393369 2537 scope.go:117] "RemoveContainer" containerID="aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440"
Apr 16 20:42:06.393597 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:42:06.393581 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440\": container with ID starting with aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440 not found: ID does not exist" containerID="aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440"
Apr 16 20:42:06.393648 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.393602 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440"} err="failed to get container status \"aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440\": rpc error: code = NotFound desc = could not find container \"aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440\": container with ID starting with aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440 not found: ID does not exist"
Apr 16 20:42:06.393648 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.393619 2537 scope.go:117] "RemoveContainer" containerID="7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43"
Apr 16 20:42:06.393836 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:42:06.393821 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43\": container with ID starting with 7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43 not found: ID does not exist" containerID="7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43"
Apr 16 20:42:06.393876 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.393838 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43"} err="failed to get container status \"7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43\": rpc error: code = NotFound desc = could not find container \"7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43\": container with ID starting with 7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43 not found: ID does not exist"
Apr 16 20:42:06.393876 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.393850 2537 scope.go:117] "RemoveContainer" containerID="2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69"
Apr 16 20:42:06.394063 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:42:06.394047 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69\": container with ID starting with 2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69 not found: ID does not exist" containerID="2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69"
Apr 16 20:42:06.394106 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.394065 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69"} err="failed to get container status \"2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69\": rpc error: code = NotFound desc = could not find container \"2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69\": container with ID starting with 2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69 not found: ID does not exist"
Apr 16 20:42:06.394106 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.394076 2537 scope.go:117] "RemoveContainer" containerID="58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b"
Apr 16 20:42:06.394244 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:42:06.394231 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b\": container with ID starting with 
58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b not found: ID does not exist" containerID="58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b" Apr 16 20:42:06.394278 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.394246 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b"} err="failed to get container status \"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b\": rpc error: code = NotFound desc = could not find container \"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b\": container with ID starting with 58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b not found: ID does not exist" Apr 16 20:42:06.394278 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.394256 2537 scope.go:117] "RemoveContainer" containerID="db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676" Apr 16 20:42:06.394554 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:42:06.394488 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676\": container with ID starting with db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676 not found: ID does not exist" containerID="db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676" Apr 16 20:42:06.394554 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.394521 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676"} err="failed to get container status \"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676\": rpc error: code = NotFound desc = could not find container \"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676\": container with ID starting with 
db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676 not found: ID does not exist" Apr 16 20:42:06.394554 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.394543 2537 scope.go:117] "RemoveContainer" containerID="41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c" Apr 16 20:42:06.395360 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:42:06.395335 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c\": container with ID starting with 41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c not found: ID does not exist" containerID="41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c" Apr 16 20:42:06.395443 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.395366 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c"} err="failed to get container status \"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c\": rpc error: code = NotFound desc = could not find container \"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c\": container with ID starting with 41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c not found: ID does not exist" Apr 16 20:42:06.395443 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.395387 2537 scope.go:117] "RemoveContainer" containerID="3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da" Apr 16 20:42:06.395646 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.395619 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da"} err="failed to get container status \"3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da\": rpc error: code = NotFound desc = could not find 
container \"3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da\": container with ID starting with 3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da not found: ID does not exist" Apr 16 20:42:06.395745 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.395648 2537 scope.go:117] "RemoveContainer" containerID="aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440" Apr 16 20:42:06.395930 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.395908 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440"} err="failed to get container status \"aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440\": rpc error: code = NotFound desc = could not find container \"aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440\": container with ID starting with aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440 not found: ID does not exist" Apr 16 20:42:06.396020 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.395932 2537 scope.go:117] "RemoveContainer" containerID="7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43" Apr 16 20:42:06.396189 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396173 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43"} err="failed to get container status \"7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43\": rpc error: code = NotFound desc = could not find container \"7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43\": container with ID starting with 7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43 not found: ID does not exist" Apr 16 20:42:06.396255 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396198 2537 scope.go:117] "RemoveContainer" 
containerID="2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69" Apr 16 20:42:06.396441 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396416 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69"} err="failed to get container status \"2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69\": rpc error: code = NotFound desc = could not find container \"2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69\": container with ID starting with 2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69 not found: ID does not exist" Apr 16 20:42:06.396441 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396433 2537 scope.go:117] "RemoveContainer" containerID="58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b" Apr 16 20:42:06.396536 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396484 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:42:06.396721 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396697 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b"} err="failed to get container status \"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b\": rpc error: code = NotFound desc = could not find container \"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b\": container with ID starting with 58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b not found: ID does not exist" Apr 16 20:42:06.396779 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396722 2537 scope.go:117] "RemoveContainer" containerID="db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676" Apr 16 20:42:06.396859 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396846 2537 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="init-config-reloader" Apr 16 20:42:06.396912 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396861 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="init-config-reloader" Apr 16 20:42:06.396912 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396873 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="prometheus" Apr 16 20:42:06.396912 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396881 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="prometheus" Apr 16 20:42:06.396912 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396890 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="config-reloader" Apr 16 20:42:06.396912 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396898 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="config-reloader" Apr 16 20:42:06.396912 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396907 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="kube-rbac-proxy-web" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396914 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="kube-rbac-proxy-web" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396923 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="kube-rbac-proxy-thanos" Apr 16 20:42:06.397104 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:42:06.396932 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="kube-rbac-proxy-thanos" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396940 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f300a967-50a7-4aca-b7dc-57b123274d17" containerName="registry" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396945 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="f300a967-50a7-4aca-b7dc-57b123274d17" containerName="registry" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396943 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676"} err="failed to get container status \"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676\": rpc error: code = NotFound desc = could not find container \"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676\": container with ID starting with db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676 not found: ID does not exist" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396961 2537 scope.go:117] "RemoveContainer" containerID="41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.396954 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="kube-rbac-proxy" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.397002 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="kube-rbac-proxy" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.397018 2537 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="thanos-sidecar" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.397024 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="thanos-sidecar" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.397085 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="config-reloader" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.397096 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="prometheus" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.397105 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="f300a967-50a7-4aca-b7dc-57b123274d17" containerName="registry" Apr 16 20:42:06.397104 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.397112 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="kube-rbac-proxy" Apr 16 20:42:06.397660 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.397120 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="kube-rbac-proxy-web" Apr 16 20:42:06.397660 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.397126 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="thanos-sidecar" Apr 16 20:42:06.397660 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.397135 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" containerName="kube-rbac-proxy-thanos" Apr 16 20:42:06.397660 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.397190 2537 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c"} err="failed to get container status \"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c\": rpc error: code = NotFound desc = could not find container \"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c\": container with ID starting with 41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c not found: ID does not exist" Apr 16 20:42:06.401594 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.401554 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b"} err="failed to get container status \"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b\": rpc error: code = NotFound desc = could not find container \"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b\": container with ID starting with 
58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b not found: ID does not exist" Apr 16 20:42:06.401663 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.401596 2537 scope.go:117] "RemoveContainer" containerID="db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676" Apr 16 20:42:06.401844 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.401826 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676"} err="failed to get container status \"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676\": rpc error: code = NotFound desc = could not find container \"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676\": container with ID starting with db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676 not found: ID does not exist" Apr 16 20:42:06.401895 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.401845 2537 scope.go:117] "RemoveContainer" containerID="41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c" Apr 16 20:42:06.402073 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.402051 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c"} err="failed to get container status \"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c\": rpc error: code = NotFound desc = could not find container \"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c\": container with ID starting with 41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c not found: ID does not exist" Apr 16 20:42:06.402073 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.402073 2537 scope.go:117] "RemoveContainer" containerID="3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da" Apr 16 20:42:06.402280 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.402262 2537 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da"} err="failed to get container status \"3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da\": rpc error: code = NotFound desc = could not find container \"3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da\": container with ID starting with 3f2ea2db1694958569d93b03dd419b64ea256e968bf413f691f9b529ecc6b2da not found: ID does not exist" Apr 16 20:42:06.402324 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.402280 2537 scope.go:117] "RemoveContainer" containerID="aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440" Apr 16 20:42:06.402470 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.402449 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440"} err="failed to get container status \"aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440\": rpc error: code = NotFound desc = could not find container \"aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440\": container with ID starting with aa4a0aae7c1f109e92a245a4a830a92b986ddd80fb513f4c0491f89a5e887440 not found: ID does not exist" Apr 16 20:42:06.402470 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.402470 2537 scope.go:117] "RemoveContainer" containerID="7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43" Apr 16 20:42:06.402649 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.402635 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.402692 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.402678 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43"} err="failed to get container status \"7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43\": rpc error: code = NotFound desc = could not find container \"7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43\": container with ID starting with 7871538fec329b53449de04d5c413d81d024234a5f828750a181feb85f91ec43 not found: ID does not exist" Apr 16 20:42:06.402730 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.402696 2537 scope.go:117] "RemoveContainer" containerID="2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69" Apr 16 20:42:06.402929 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.402895 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69"} err="failed to get container status \"2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69\": rpc error: code = NotFound desc = could not find container \"2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69\": container with ID starting with 2ab1d9d238738be7c55ee9aa6243530dc33e3b37bdad350039efefb254d87f69 not found: ID does not exist" Apr 16 20:42:06.402929 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.402920 2537 scope.go:117] "RemoveContainer" containerID="58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b" Apr 16 20:42:06.403196 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.403178 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b"} err="failed to get container status 
\"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b\": rpc error: code = NotFound desc = could not find container \"58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b\": container with ID starting with 58fc42f248c1cb275e0b398632fadcd143c9f87bb1c3a73b7373c3a0a00c208b not found: ID does not exist" Apr 16 20:42:06.403298 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.403198 2537 scope.go:117] "RemoveContainer" containerID="db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676" Apr 16 20:42:06.403479 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.403456 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676"} err="failed to get container status \"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676\": rpc error: code = NotFound desc = could not find container \"db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676\": container with ID starting with db3c2625410e9c686b14be512339488a65e092e73ef05a8656b9fa5b418a7676 not found: ID does not exist" Apr 16 20:42:06.403536 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.403481 2537 scope.go:117] "RemoveContainer" containerID="41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c" Apr 16 20:42:06.403700 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.403682 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c"} err="failed to get container status \"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c\": rpc error: code = NotFound desc = could not find container \"41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c\": container with ID starting with 41049a9f41018c10758d08da283e6944f25366716b536a76e2e1bb87296e6f4c not found: ID does not exist" Apr 16 20:42:06.404910 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:42:06.404891 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 20:42:06.405403 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.405384 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 20:42:06.407060 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.405791 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 20:42:06.407060 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.406796 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 20:42:06.407233 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.407062 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 20:42:06.407233 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.407079 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 20:42:06.407233 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.407161 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 20:42:06.407381 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.407266 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 20:42:06.407381 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.407304 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bea04r43g4e0l\"" Apr 16 20:42:06.407480 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.407404 
2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 20:42:06.407480 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.407307 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 20:42:06.407599 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.407534 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-p998x\"" Apr 16 20:42:06.407696 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.407679 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 20:42:06.408927 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.408908 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 20:42:06.411852 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.411804 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 20:42:06.413605 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.413583 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:42:06.501506 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501477 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.501647 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501532 2537 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-config\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.501647 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501604 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.501647 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501634 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.501808 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501669 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.501808 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501700 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-web-config\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.501808 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501727 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.501808 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501760 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.501808 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501799 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.502014 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501822 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/20f067b7-64cc-4569-9397-01e5937afcfa-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.502014 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501846 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.502014 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501873 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.502014 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501902 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.502014 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501939 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.502014 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.501989 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/20f067b7-64cc-4569-9397-01e5937afcfa-config-out\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
20:42:06.502202 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.502017 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvq96\" (UniqueName: \"kubernetes.io/projected/20f067b7-64cc-4569-9397-01e5937afcfa-kube-api-access-dvq96\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.502202 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.502042 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.502202 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.502059 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/20f067b7-64cc-4569-9397-01e5937afcfa-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602342 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602284 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602342 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602312 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-config\") pod \"prometheus-k8s-0\" (UID: 
\"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602342 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602339 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602543 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602363 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602543 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602397 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602543 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602427 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-web-config\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602543 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602452 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602543 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602509 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602729 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602555 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602729 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602606 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/20f067b7-64cc-4569-9397-01e5937afcfa-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602729 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602631 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602729 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602662 2537 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602729 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602687 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602958 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602731 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602958 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602757 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/20f067b7-64cc-4569-9397-01e5937afcfa-config-out\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602958 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602785 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvq96\" (UniqueName: \"kubernetes.io/projected/20f067b7-64cc-4569-9397-01e5937afcfa-kube-api-access-dvq96\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602958 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:42:06.602819 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.602958 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.602849 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/20f067b7-64cc-4569-9397-01e5937afcfa-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.605836 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.605806 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.607172 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.606165 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/20f067b7-64cc-4569-9397-01e5937afcfa-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.607172 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.606470 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
16 20:42:06.607172 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.606948 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-web-config\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.607172 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.607025 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.607421 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.607201 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.607475 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.607432 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-config\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.608033 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.607902 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/20f067b7-64cc-4569-9397-01e5937afcfa-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:42:06.608033 
ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.607930 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:06.608377 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.608353 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:06.608661 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.608643 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:06.608737 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.608706 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:06.608737 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.608716 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:06.608834 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.608816 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/20f067b7-64cc-4569-9397-01e5937afcfa-config-out\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:06.608921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.608903 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/20f067b7-64cc-4569-9397-01e5937afcfa-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:06.611629 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.610368 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:06.612901 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.612878 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/20f067b7-64cc-4569-9397-01e5937afcfa-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:06.614013 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.613994 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvq96\" (UniqueName: \"kubernetes.io/projected/20f067b7-64cc-4569-9397-01e5937afcfa-kube-api-access-dvq96\") pod \"prometheus-k8s-0\" (UID: \"20f067b7-64cc-4569-9397-01e5937afcfa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:06.714498 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.714475 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:06.835101 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:06.835076 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:42:06.837345 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:42:06.837313 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20f067b7_64cc_4569_9397_01e5937afcfa.slice/crio-de3b06704397ea43c3e84651ca207340486f742a418cdaf6b0f5adfec5097e87 WatchSource:0}: Error finding container de3b06704397ea43c3e84651ca207340486f742a418cdaf6b0f5adfec5097e87: Status 404 returned error can't find the container with id de3b06704397ea43c3e84651ca207340486f742a418cdaf6b0f5adfec5097e87
Apr 16 20:42:07.352043 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:07.352012 2537 generic.go:358] "Generic (PLEG): container finished" podID="20f067b7-64cc-4569-9397-01e5937afcfa" containerID="0fc3588c4e4272daa7bb7a492a2bb33f8c2ee769af083a1b3f158d318336e6c7" exitCode=0
Apr 16 20:42:07.352043 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:07.352047 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"20f067b7-64cc-4569-9397-01e5937afcfa","Type":"ContainerDied","Data":"0fc3588c4e4272daa7bb7a492a2bb33f8c2ee769af083a1b3f158d318336e6c7"}
Apr 16 20:42:07.352244 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:07.352066 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"20f067b7-64cc-4569-9397-01e5937afcfa","Type":"ContainerStarted","Data":"de3b06704397ea43c3e84651ca207340486f742a418cdaf6b0f5adfec5097e87"}
Apr 16 20:42:07.588854 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:07.588778 2537 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5deb3abf-e442-44b4-bea6-61efd1b060c0" path="/var/lib/kubelet/pods/5deb3abf-e442-44b4-bea6-61efd1b060c0/volumes"
Apr 16 20:42:08.358986 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:08.358957 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"20f067b7-64cc-4569-9397-01e5937afcfa","Type":"ContainerStarted","Data":"52180dbea96004bca138ef8e9cc50bf64a69d6bd0552f4979a466babfb3efd4c"}
Apr 16 20:42:08.358986 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:08.358987 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"20f067b7-64cc-4569-9397-01e5937afcfa","Type":"ContainerStarted","Data":"e45f5608dc94c12d2561684b48c7b361680a3c4b7189358eb787f1b3cbddd9d6"}
Apr 16 20:42:08.359370 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:08.358999 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"20f067b7-64cc-4569-9397-01e5937afcfa","Type":"ContainerStarted","Data":"a0d675c83bef4236c767b6620781f5b28e8d5931a61b21527158c041375a2b1c"}
Apr 16 20:42:08.359370 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:08.359008 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"20f067b7-64cc-4569-9397-01e5937afcfa","Type":"ContainerStarted","Data":"c5c48e33594b9770a8abaf1e4a02c6b4dc13c3c7f5fd1c5c511231d1987f1f5b"}
Apr 16 20:42:08.359370 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:08.359015 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"20f067b7-64cc-4569-9397-01e5937afcfa","Type":"ContainerStarted","Data":"2d2c47514259d83106352df6dbae2400bf0875bb4ca0be9ab1f9637561f5aeef"}
Apr 16 20:42:08.359370 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:08.359023 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"20f067b7-64cc-4569-9397-01e5937afcfa","Type":"ContainerStarted","Data":"ec90c7a0910760637756018438ff4d5870a34deec80151de8e2b07e6d64aabcb"}
Apr 16 20:42:08.383773 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:08.383729 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.3837167790000002 podStartE2EDuration="2.383716779s" podCreationTimestamp="2026-04-16 20:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:42:08.382616519 +0000 UTC m=+267.367760109" watchObservedRunningTime="2026-04-16 20:42:08.383716779 +0000 UTC m=+267.368860369"
Apr 16 20:42:11.714745 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:11.714712 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:42:41.503254 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:42:41.503223 2537 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 20:43:06.715250 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:43:06.715212 2537 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:43:06.730921 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:43:06.730888 2537 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:43:07.537982 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:43:07.537957 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:44:45.343136 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.343096 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h"]
Apr 16 20:44:45.346356 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.346336 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h"
Apr 16 20:44:45.349374 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.349353 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:44:45.349495 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.349405 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-jscw5\""
Apr 16 20:44:45.349495 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.349405 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 16 20:44:45.357961 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.357941 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h"]
Apr 16 20:44:45.417595 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.417546 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwsbz\" (UniqueName: \"kubernetes.io/projected/84301946-d4b6-4a37-9cd5-067343bfd8ff-kube-api-access-jwsbz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-n789h\" (UID: \"84301946-d4b6-4a37-9cd5-067343bfd8ff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h"
Apr 16 20:44:45.417719 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.417673 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84301946-d4b6-4a37-9cd5-067343bfd8ff-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-n789h\" (UID: \"84301946-d4b6-4a37-9cd5-067343bfd8ff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h"
Apr 16 20:44:45.518382 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.518352 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84301946-d4b6-4a37-9cd5-067343bfd8ff-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-n789h\" (UID: \"84301946-d4b6-4a37-9cd5-067343bfd8ff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h"
Apr 16 20:44:45.518533 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.518398 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwsbz\" (UniqueName: \"kubernetes.io/projected/84301946-d4b6-4a37-9cd5-067343bfd8ff-kube-api-access-jwsbz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-n789h\" (UID: \"84301946-d4b6-4a37-9cd5-067343bfd8ff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h"
Apr 16 20:44:45.518746 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.518725 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84301946-d4b6-4a37-9cd5-067343bfd8ff-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-n789h\" (UID: \"84301946-d4b6-4a37-9cd5-067343bfd8ff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h"
Apr 16 20:44:45.526160 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.526137 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwsbz\" (UniqueName: \"kubernetes.io/projected/84301946-d4b6-4a37-9cd5-067343bfd8ff-kube-api-access-jwsbz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-n789h\" (UID: \"84301946-d4b6-4a37-9cd5-067343bfd8ff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h"
Apr 16 20:44:45.655481 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.655449 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h"
Apr 16 20:44:45.776009 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.775982 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h"]
Apr 16 20:44:45.779383 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:44:45.779359 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84301946_d4b6_4a37_9cd5_067343bfd8ff.slice/crio-1982cfe640cf1d5a51f97a24d34f55cfb211d3937f054d63bd24b0aefe381190 WatchSource:0}: Error finding container 1982cfe640cf1d5a51f97a24d34f55cfb211d3937f054d63bd24b0aefe381190: Status 404 returned error can't find the container with id 1982cfe640cf1d5a51f97a24d34f55cfb211d3937f054d63bd24b0aefe381190
Apr 16 20:44:45.782330 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.782315 2537 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:44:45.788285 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:45.788250 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h" event={"ID":"84301946-d4b6-4a37-9cd5-067343bfd8ff","Type":"ContainerStarted","Data":"1982cfe640cf1d5a51f97a24d34f55cfb211d3937f054d63bd24b0aefe381190"}
Apr 16 20:44:48.805982 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:48.805904 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h" event={"ID":"84301946-d4b6-4a37-9cd5-067343bfd8ff","Type":"ContainerStarted","Data":"0ca95e88f7164720d6e7f28393fa458f70dc115710e4a8d1b9100e6050585577"}
Apr 16 20:44:48.825102 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:48.825037 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n789h" podStartSLOduration=1.151501811 podStartE2EDuration="3.825023687s" podCreationTimestamp="2026-04-16 20:44:45 +0000 UTC" firstStartedPulling="2026-04-16 20:44:45.782437241 +0000 UTC m=+424.767580813" lastFinishedPulling="2026-04-16 20:44:48.455959113 +0000 UTC m=+427.441102689" observedRunningTime="2026-04-16 20:44:48.823521699 +0000 UTC m=+427.808665289" watchObservedRunningTime="2026-04-16 20:44:48.825023687 +0000 UTC m=+427.810167260"
Apr 16 20:44:54.352452 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.352419 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-m6wsb"]
Apr 16 20:44:54.355746 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.355726 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-m6wsb"
Apr 16 20:44:54.358196 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.358176 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 20:44:54.358328 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.358281 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 20:44:54.359050 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.359035 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-s9xsz\""
Apr 16 20:44:54.363877 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.363858 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-m6wsb"]
Apr 16 20:44:54.387424 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.387398 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brzbg\" (UniqueName: \"kubernetes.io/projected/ec3e0320-d317-4e98-b905-cd8ec0ab86e0-kube-api-access-brzbg\") pod \"cert-manager-cainjector-8966b78d4-m6wsb\" (UID: \"ec3e0320-d317-4e98-b905-cd8ec0ab86e0\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-m6wsb"
Apr 16 20:44:54.387511 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.387461 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec3e0320-d317-4e98-b905-cd8ec0ab86e0-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-m6wsb\" (UID: \"ec3e0320-d317-4e98-b905-cd8ec0ab86e0\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-m6wsb"
Apr 16 20:44:54.488067 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.488031 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec3e0320-d317-4e98-b905-cd8ec0ab86e0-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-m6wsb\" (UID: \"ec3e0320-d317-4e98-b905-cd8ec0ab86e0\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-m6wsb"
Apr 16 20:44:54.488228 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.488092 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brzbg\" (UniqueName: \"kubernetes.io/projected/ec3e0320-d317-4e98-b905-cd8ec0ab86e0-kube-api-access-brzbg\") pod \"cert-manager-cainjector-8966b78d4-m6wsb\" (UID: \"ec3e0320-d317-4e98-b905-cd8ec0ab86e0\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-m6wsb"
Apr 16 20:44:54.496802 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.496778 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec3e0320-d317-4e98-b905-cd8ec0ab86e0-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-m6wsb\" (UID: \"ec3e0320-d317-4e98-b905-cd8ec0ab86e0\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-m6wsb"
Apr 16 20:44:54.496922 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.496869 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brzbg\" (UniqueName: \"kubernetes.io/projected/ec3e0320-d317-4e98-b905-cd8ec0ab86e0-kube-api-access-brzbg\") pod \"cert-manager-cainjector-8966b78d4-m6wsb\" (UID: \"ec3e0320-d317-4e98-b905-cd8ec0ab86e0\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-m6wsb"
Apr 16 20:44:54.672817 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.672784 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-m6wsb"
Apr 16 20:44:54.785270 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.785246 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-m6wsb"]
Apr 16 20:44:54.787484 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:44:54.787454 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec3e0320_d317_4e98_b905_cd8ec0ab86e0.slice/crio-92acc908ed67f27edf8e47c26f8fc9227ffa50d9c95d894298ff59567711c1da WatchSource:0}: Error finding container 92acc908ed67f27edf8e47c26f8fc9227ffa50d9c95d894298ff59567711c1da: Status 404 returned error can't find the container with id 92acc908ed67f27edf8e47c26f8fc9227ffa50d9c95d894298ff59567711c1da
Apr 16 20:44:54.828065 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:54.828038 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-m6wsb" event={"ID":"ec3e0320-d317-4e98-b905-cd8ec0ab86e0","Type":"ContainerStarted","Data":"92acc908ed67f27edf8e47c26f8fc9227ffa50d9c95d894298ff59567711c1da"}
Apr 16 20:44:58.842257 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:44:58.842215 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-m6wsb" event={"ID":"ec3e0320-d317-4e98-b905-cd8ec0ab86e0","Type":"ContainerStarted","Data":"732b7227fa93954769f5b002555d3ace2f53bf45fc9c877433c5b430d89e623a"}
Apr 16 20:45:24.732896 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.732842 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-m6wsb" podStartSLOduration=27.471421727 podStartE2EDuration="30.732823257s" podCreationTimestamp="2026-04-16 20:44:54 +0000 UTC" firstStartedPulling="2026-04-16 20:44:54.789189342 +0000 UTC m=+433.774332914" lastFinishedPulling="2026-04-16 20:44:58.050590862 +0000 UTC m=+437.035734444" observedRunningTime="2026-04-16 20:44:58.858712592 +0000 UTC m=+437.843856183" watchObservedRunningTime="2026-04-16 20:45:24.732823257 +0000 UTC m=+463.717966852"
Apr 16 20:45:24.733270 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.733163 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"]
Apr 16 20:45:24.738204 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.738184 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:24.741198 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.741176 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-d5gf4\""
Apr 16 20:45:24.742118 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.742098 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 20:45:24.742248 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.742119 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 20:45:24.742248 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.742138 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:45:24.742248 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.742164 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 20:45:24.742248 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.742119 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 20:45:24.745464 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.745446 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"]
Apr 16 20:45:24.809288 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.809266 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdaddecb-774b-4a02-bd24-e82de6235dbb-metrics-cert\") pod \"lws-controller-manager-5494fc4578-p6zmk\" (UID: \"fdaddecb-774b-4a02-bd24-e82de6235dbb\") " pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:24.809383 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.809294 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdaddecb-774b-4a02-bd24-e82de6235dbb-cert\") pod \"lws-controller-manager-5494fc4578-p6zmk\" (UID: \"fdaddecb-774b-4a02-bd24-e82de6235dbb\") " pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:24.809383 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.809331 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fdaddecb-774b-4a02-bd24-e82de6235dbb-manager-config\") pod \"lws-controller-manager-5494fc4578-p6zmk\" (UID: \"fdaddecb-774b-4a02-bd24-e82de6235dbb\") " pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:24.809456 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.809383 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmz8h\" (UniqueName: \"kubernetes.io/projected/fdaddecb-774b-4a02-bd24-e82de6235dbb-kube-api-access-zmz8h\") pod \"lws-controller-manager-5494fc4578-p6zmk\" (UID: \"fdaddecb-774b-4a02-bd24-e82de6235dbb\") " pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:24.910552 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.910530 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fdaddecb-774b-4a02-bd24-e82de6235dbb-manager-config\") pod \"lws-controller-manager-5494fc4578-p6zmk\" (UID: \"fdaddecb-774b-4a02-bd24-e82de6235dbb\") " pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:24.910678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.910578 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmz8h\" (UniqueName: \"kubernetes.io/projected/fdaddecb-774b-4a02-bd24-e82de6235dbb-kube-api-access-zmz8h\") pod \"lws-controller-manager-5494fc4578-p6zmk\" (UID: \"fdaddecb-774b-4a02-bd24-e82de6235dbb\") " pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:24.910678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.910626 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdaddecb-774b-4a02-bd24-e82de6235dbb-metrics-cert\") pod \"lws-controller-manager-5494fc4578-p6zmk\" (UID: \"fdaddecb-774b-4a02-bd24-e82de6235dbb\") " pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:24.910678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.910646 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdaddecb-774b-4a02-bd24-e82de6235dbb-cert\") pod \"lws-controller-manager-5494fc4578-p6zmk\" (UID: \"fdaddecb-774b-4a02-bd24-e82de6235dbb\") " pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:24.911156 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.911132 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fdaddecb-774b-4a02-bd24-e82de6235dbb-manager-config\") pod \"lws-controller-manager-5494fc4578-p6zmk\" (UID: \"fdaddecb-774b-4a02-bd24-e82de6235dbb\") " pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:24.912953 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.912935 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdaddecb-774b-4a02-bd24-e82de6235dbb-metrics-cert\") pod \"lws-controller-manager-5494fc4578-p6zmk\" (UID: \"fdaddecb-774b-4a02-bd24-e82de6235dbb\") " pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:24.913034 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.913015 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdaddecb-774b-4a02-bd24-e82de6235dbb-cert\") pod \"lws-controller-manager-5494fc4578-p6zmk\" (UID: \"fdaddecb-774b-4a02-bd24-e82de6235dbb\") " pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:24.919052 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:24.919021 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmz8h\" (UniqueName: \"kubernetes.io/projected/fdaddecb-774b-4a02-bd24-e82de6235dbb-kube-api-access-zmz8h\") pod \"lws-controller-manager-5494fc4578-p6zmk\" (UID: \"fdaddecb-774b-4a02-bd24-e82de6235dbb\") " pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:25.047387 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:25.047316 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:25.165002 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:25.164974 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"]
Apr 16 20:45:25.167669 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:45:25.167640 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdaddecb_774b_4a02_bd24_e82de6235dbb.slice/crio-a434bf96badbafa7d867c608b69122f10eca9c2ba52bc5c1773c7c660c2bc191 WatchSource:0}: Error finding container a434bf96badbafa7d867c608b69122f10eca9c2ba52bc5c1773c7c660c2bc191: Status 404 returned error can't find the container with id a434bf96badbafa7d867c608b69122f10eca9c2ba52bc5c1773c7c660c2bc191
Apr 16 20:45:25.920186 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:25.920146 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk" event={"ID":"fdaddecb-774b-4a02-bd24-e82de6235dbb","Type":"ContainerStarted","Data":"a434bf96badbafa7d867c608b69122f10eca9c2ba52bc5c1773c7c660c2bc191"}
Apr 16 20:45:27.927509 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:27.927475 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk" event={"ID":"fdaddecb-774b-4a02-bd24-e82de6235dbb","Type":"ContainerStarted","Data":"d6ef8ad679b78a44d7c1e6e197cbfb276c51a8e0831a44c919a4dccfceebcdb1"}
Apr 16 20:45:27.927916 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:27.927531 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk"
Apr 16 20:45:27.944513 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:27.944473 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk" podStartSLOduration=1.5510897799999999 podStartE2EDuration="3.944459545s" podCreationTimestamp="2026-04-16 20:45:24 +0000 UTC" firstStartedPulling="2026-04-16 20:45:25.169402175 +0000 UTC m=+464.154545748" lastFinishedPulling="2026-04-16 20:45:27.562771939 +0000 UTC m=+466.547915513" observedRunningTime="2026-04-16 20:45:27.942308297 +0000 UTC m=+466.927451887" watchObservedRunningTime="2026-04-16 20:45:27.944459545 +0000 UTC m=+466.929603135"
Apr 16 20:45:32.774309 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.774270 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"]
Apr 16 20:45:32.776640 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.776615 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"
Apr 16 20:45:32.778947 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.778921 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 20:45:32.779056 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.779003 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 20:45:32.779110 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.779070 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 20:45:32.779110 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.779070 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-lt7rc\""
Apr 16 20:45:32.779212 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.779078 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 20:45:32.797579 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.797541 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"]
Apr 16 20:45:32.876001 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.875975 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxq2s\" (UniqueName: \"kubernetes.io/projected/13b78c68-8f8b-4fa4-b851-935fe80ea781-kube-api-access-jxq2s\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-b68ld\" (UID: \"13b78c68-8f8b-4fa4-b851-935fe80ea781\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"
Apr 16 20:45:32.876107 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.876006 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/13b78c68-8f8b-4fa4-b851-935fe80ea781-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-b68ld\" (UID: \"13b78c68-8f8b-4fa4-b851-935fe80ea781\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"
Apr 16 20:45:32.876107 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.876080 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/13b78c68-8f8b-4fa4-b851-935fe80ea781-webhook-cert\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-b68ld\" (UID: \"13b78c68-8f8b-4fa4-b851-935fe80ea781\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"
Apr 16 20:45:32.977284 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.977249 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxq2s\" (UniqueName: \"kubernetes.io/projected/13b78c68-8f8b-4fa4-b851-935fe80ea781-kube-api-access-jxq2s\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-b68ld\" (UID: \"13b78c68-8f8b-4fa4-b851-935fe80ea781\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"
Apr 16 20:45:32.977284 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.977288 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/13b78c68-8f8b-4fa4-b851-935fe80ea781-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-b68ld\" (UID: \"13b78c68-8f8b-4fa4-b851-935fe80ea781\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"
Apr 16 20:45:32.977499 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.977472 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/13b78c68-8f8b-4fa4-b851-935fe80ea781-webhook-cert\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-b68ld\" (UID: \"13b78c68-8f8b-4fa4-b851-935fe80ea781\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"
Apr 16 20:45:32.979679 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.979657 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/13b78c68-8f8b-4fa4-b851-935fe80ea781-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-b68ld\" (UID: \"13b78c68-8f8b-4fa4-b851-935fe80ea781\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"
Apr 16 20:45:32.979756 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.979663 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/13b78c68-8f8b-4fa4-b851-935fe80ea781-webhook-cert\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-b68ld\" (UID: \"13b78c68-8f8b-4fa4-b851-935fe80ea781\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"
Apr 16 20:45:32.985615 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:32.985593 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxq2s\" (UniqueName: \"kubernetes.io/projected/13b78c68-8f8b-4fa4-b851-935fe80ea781-kube-api-access-jxq2s\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-b68ld\" (UID: \"13b78c68-8f8b-4fa4-b851-935fe80ea781\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"
Apr 16 20:45:33.087864 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:33.087793 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"
Apr 16 20:45:33.209161 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:33.209114 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld"]
Apr 16 20:45:33.213652 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:45:33.213620 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13b78c68_8f8b_4fa4_b851_935fe80ea781.slice/crio-565154ad1d07a92984da9a022b03bdca81f1bf453d1028fea101459158b012e0 WatchSource:0}: Error finding container 565154ad1d07a92984da9a022b03bdca81f1bf453d1028fea101459158b012e0: Status 404 returned error can't find the container with id 565154ad1d07a92984da9a022b03bdca81f1bf453d1028fea101459158b012e0
Apr 16 20:45:33.948088 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:33.948053 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld" event={"ID":"13b78c68-8f8b-4fa4-b851-935fe80ea781","Type":"ContainerStarted","Data":"565154ad1d07a92984da9a022b03bdca81f1bf453d1028fea101459158b012e0"}
Apr 16 20:45:35.956441 ip-10-0-129-199
kubenswrapper[2537]: I0416 20:45:35.956360 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld" event={"ID":"13b78c68-8f8b-4fa4-b851-935fe80ea781","Type":"ContainerStarted","Data":"fae643015e1c942af3b231b2adaf732c33f4edb0f74957c56c717ecda84f0e0a"} Apr 16 20:45:35.956832 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:35.956525 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld" Apr 16 20:45:35.977169 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:35.977082 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld" podStartSLOduration=1.549560675 podStartE2EDuration="3.977063979s" podCreationTimestamp="2026-04-16 20:45:32 +0000 UTC" firstStartedPulling="2026-04-16 20:45:33.215877491 +0000 UTC m=+472.201021060" lastFinishedPulling="2026-04-16 20:45:35.643380779 +0000 UTC m=+474.628524364" observedRunningTime="2026-04-16 20:45:35.975386882 +0000 UTC m=+474.960530486" watchObservedRunningTime="2026-04-16 20:45:35.977063979 +0000 UTC m=+474.962207571" Apr 16 20:45:38.933733 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:38.933687 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5494fc4578-p6zmk" Apr 16 20:45:46.961696 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:46.961665 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-b68ld" Apr 16 20:45:51.079371 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.079334 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-588879f674-dss9z"] Apr 16 20:45:51.081937 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.081914 2537 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" Apr 16 20:45:51.084780 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.084755 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 20:45:51.089874 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.089221 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 20:45:51.089874 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.089438 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-tlw27\"" Apr 16 20:45:51.089874 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.089734 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 20:45:51.089874 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.089757 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 20:45:51.094299 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.094277 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-588879f674-dss9z"] Apr 16 20:45:51.118392 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.118371 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c85833a-86bb-4bcf-ad25-a815d8e4ad37-tmp\") pod \"kube-auth-proxy-588879f674-dss9z\" (UID: \"4c85833a-86bb-4bcf-ad25-a815d8e4ad37\") " pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" Apr 16 20:45:51.118506 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.118404 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg8dj\" (UniqueName: 
\"kubernetes.io/projected/4c85833a-86bb-4bcf-ad25-a815d8e4ad37-kube-api-access-vg8dj\") pod \"kube-auth-proxy-588879f674-dss9z\" (UID: \"4c85833a-86bb-4bcf-ad25-a815d8e4ad37\") " pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" Apr 16 20:45:51.118506 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.118442 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c85833a-86bb-4bcf-ad25-a815d8e4ad37-tls-certs\") pod \"kube-auth-proxy-588879f674-dss9z\" (UID: \"4c85833a-86bb-4bcf-ad25-a815d8e4ad37\") " pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" Apr 16 20:45:51.219173 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.219148 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c85833a-86bb-4bcf-ad25-a815d8e4ad37-tmp\") pod \"kube-auth-proxy-588879f674-dss9z\" (UID: \"4c85833a-86bb-4bcf-ad25-a815d8e4ad37\") " pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" Apr 16 20:45:51.219320 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.219188 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vg8dj\" (UniqueName: \"kubernetes.io/projected/4c85833a-86bb-4bcf-ad25-a815d8e4ad37-kube-api-access-vg8dj\") pod \"kube-auth-proxy-588879f674-dss9z\" (UID: \"4c85833a-86bb-4bcf-ad25-a815d8e4ad37\") " pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" Apr 16 20:45:51.219320 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.219249 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c85833a-86bb-4bcf-ad25-a815d8e4ad37-tls-certs\") pod \"kube-auth-proxy-588879f674-dss9z\" (UID: \"4c85833a-86bb-4bcf-ad25-a815d8e4ad37\") " pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" Apr 16 20:45:51.221415 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.221390 
2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c85833a-86bb-4bcf-ad25-a815d8e4ad37-tmp\") pod \"kube-auth-proxy-588879f674-dss9z\" (UID: \"4c85833a-86bb-4bcf-ad25-a815d8e4ad37\") " pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" Apr 16 20:45:51.221729 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.221710 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c85833a-86bb-4bcf-ad25-a815d8e4ad37-tls-certs\") pod \"kube-auth-proxy-588879f674-dss9z\" (UID: \"4c85833a-86bb-4bcf-ad25-a815d8e4ad37\") " pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" Apr 16 20:45:51.229329 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.229303 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg8dj\" (UniqueName: \"kubernetes.io/projected/4c85833a-86bb-4bcf-ad25-a815d8e4ad37-kube-api-access-vg8dj\") pod \"kube-auth-proxy-588879f674-dss9z\" (UID: \"4c85833a-86bb-4bcf-ad25-a815d8e4ad37\") " pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" Apr 16 20:45:51.393412 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.393379 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" Apr 16 20:45:51.505075 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:51.505051 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-588879f674-dss9z"] Apr 16 20:45:51.507352 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:45:51.507322 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c85833a_86bb_4bcf_ad25_a815d8e4ad37.slice/crio-33bc5bc0576a8abe1d4e52c8095a252f4fe348b735d9688d0b9d6e79d376c971 WatchSource:0}: Error finding container 33bc5bc0576a8abe1d4e52c8095a252f4fe348b735d9688d0b9d6e79d376c971: Status 404 returned error can't find the container with id 33bc5bc0576a8abe1d4e52c8095a252f4fe348b735d9688d0b9d6e79d376c971 Apr 16 20:45:52.007230 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:52.007186 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" event={"ID":"4c85833a-86bb-4bcf-ad25-a815d8e4ad37","Type":"ContainerStarted","Data":"33bc5bc0576a8abe1d4e52c8095a252f4fe348b735d9688d0b9d6e79d376c971"} Apr 16 20:45:55.018870 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:55.018783 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" event={"ID":"4c85833a-86bb-4bcf-ad25-a815d8e4ad37","Type":"ContainerStarted","Data":"7f04a783cdacce8cfd6581f664a3e2c85ad8bbda5f40a794b4617a2bbf318443"} Apr 16 20:45:55.037141 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:45:55.037093 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-588879f674-dss9z" podStartSLOduration=0.903072379 podStartE2EDuration="4.037079527s" podCreationTimestamp="2026-04-16 20:45:51 +0000 UTC" firstStartedPulling="2026-04-16 20:45:51.5089236 +0000 UTC m=+490.494067172" lastFinishedPulling="2026-04-16 20:45:54.642930751 +0000 UTC 
m=+493.628074320" observedRunningTime="2026-04-16 20:45:55.035158927 +0000 UTC m=+494.020302519" watchObservedRunningTime="2026-04-16 20:45:55.037079527 +0000 UTC m=+494.022223117" Apr 16 20:47:34.420550 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.420519 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh"] Apr 16 20:47:34.423679 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.423658 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" Apr 16 20:47:34.426746 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.426718 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 20:47:34.426892 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.426870 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-kbkct\"" Apr 16 20:47:34.427007 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.426985 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 20:47:34.435267 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.435246 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh"] Apr 16 20:47:34.531569 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.531523 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a6057d4d-a187-4d48-9e96-842d434e3b1e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-tq9bh\" (UID: \"a6057d4d-a187-4d48-9e96-842d434e3b1e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" 
Apr 16 20:47:34.531736 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.531676 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvn4g\" (UniqueName: \"kubernetes.io/projected/a6057d4d-a187-4d48-9e96-842d434e3b1e-kube-api-access-lvn4g\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-tq9bh\" (UID: \"a6057d4d-a187-4d48-9e96-842d434e3b1e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" Apr 16 20:47:34.632673 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.632629 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvn4g\" (UniqueName: \"kubernetes.io/projected/a6057d4d-a187-4d48-9e96-842d434e3b1e-kube-api-access-lvn4g\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-tq9bh\" (UID: \"a6057d4d-a187-4d48-9e96-842d434e3b1e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" Apr 16 20:47:34.632848 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.632689 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a6057d4d-a187-4d48-9e96-842d434e3b1e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-tq9bh\" (UID: \"a6057d4d-a187-4d48-9e96-842d434e3b1e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" Apr 16 20:47:34.633049 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.633013 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a6057d4d-a187-4d48-9e96-842d434e3b1e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-tq9bh\" (UID: \"a6057d4d-a187-4d48-9e96-842d434e3b1e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" Apr 16 20:47:34.641515 ip-10-0-129-199 kubenswrapper[2537]: 
I0416 20:47:34.641481 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvn4g\" (UniqueName: \"kubernetes.io/projected/a6057d4d-a187-4d48-9e96-842d434e3b1e-kube-api-access-lvn4g\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-tq9bh\" (UID: \"a6057d4d-a187-4d48-9e96-842d434e3b1e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" Apr 16 20:47:34.733469 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.733376 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" Apr 16 20:47:34.862775 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:34.862749 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh"] Apr 16 20:47:34.865483 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:47:34.865457 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6057d4d_a187_4d48_9e96_842d434e3b1e.slice/crio-6f9bd4e49288c8b1f6813dc5212d0f3d3959d6db9a7778c3f96c905ebf2c615c WatchSource:0}: Error finding container 6f9bd4e49288c8b1f6813dc5212d0f3d3959d6db9a7778c3f96c905ebf2c615c: Status 404 returned error can't find the container with id 6f9bd4e49288c8b1f6813dc5212d0f3d3959d6db9a7778c3f96c905ebf2c615c Apr 16 20:47:35.345074 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:35.345044 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" event={"ID":"a6057d4d-a187-4d48-9e96-842d434e3b1e","Type":"ContainerStarted","Data":"6f9bd4e49288c8b1f6813dc5212d0f3d3959d6db9a7778c3f96c905ebf2c615c"} Apr 16 20:47:40.368423 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:40.368351 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" 
event={"ID":"a6057d4d-a187-4d48-9e96-842d434e3b1e","Type":"ContainerStarted","Data":"b427018db9f1cd41e55b8fb5214b01de12f4056ed0adfc925397a183d7480137"} Apr 16 20:47:40.368423 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:40.368417 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" Apr 16 20:47:40.386860 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:40.386817 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" podStartSLOduration=1.292770039 podStartE2EDuration="6.386768768s" podCreationTimestamp="2026-04-16 20:47:34 +0000 UTC" firstStartedPulling="2026-04-16 20:47:34.867793383 +0000 UTC m=+593.852936952" lastFinishedPulling="2026-04-16 20:47:39.961792108 +0000 UTC m=+598.946935681" observedRunningTime="2026-04-16 20:47:40.385591489 +0000 UTC m=+599.370735080" watchObservedRunningTime="2026-04-16 20:47:40.386768768 +0000 UTC m=+599.371912358" Apr 16 20:47:51.374343 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:51.374314 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" Apr 16 20:47:52.326037 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.326004 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh"] Apr 16 20:47:52.326266 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.326242 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" podUID="a6057d4d-a187-4d48-9e96-842d434e3b1e" containerName="manager" containerID="cri-o://b427018db9f1cd41e55b8fb5214b01de12f4056ed0adfc925397a183d7480137" gracePeriod=2 Apr 16 20:47:52.336132 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.336105 2537 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh"] Apr 16 20:47:52.392723 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.392690 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4"] Apr 16 20:47:52.393125 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.393107 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6057d4d-a187-4d48-9e96-842d434e3b1e" containerName="manager" Apr 16 20:47:52.393169 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.393130 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6057d4d-a187-4d48-9e96-842d434e3b1e" containerName="manager" Apr 16 20:47:52.393266 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.393255 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6057d4d-a187-4d48-9e96-842d434e3b1e" containerName="manager" Apr 16 20:47:52.396570 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.396543 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" Apr 16 20:47:52.398733 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.398698 2537 status_manager.go:895] "Failed to get status for pod" podUID="a6057d4d-a187-4d48-9e96-842d434e3b1e" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-tq9bh\" is forbidden: User \"system:node:ip-10-0-129-199.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-199.ec2.internal' and this object" Apr 16 20:47:52.425755 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.425729 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4"] Apr 16 20:47:52.478881 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.478851 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3c72d40d-af7b-433b-ae82-7edd71f8b76f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vnxh4\" (UID: \"3c72d40d-af7b-433b-ae82-7edd71f8b76f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" Apr 16 20:47:52.479004 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.478902 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzrzs\" (UniqueName: \"kubernetes.io/projected/3c72d40d-af7b-433b-ae82-7edd71f8b76f-kube-api-access-rzrzs\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vnxh4\" (UID: \"3c72d40d-af7b-433b-ae82-7edd71f8b76f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" Apr 16 20:47:52.553011 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.552991 2537 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" Apr 16 20:47:52.555389 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.555363 2537 status_manager.go:895] "Failed to get status for pod" podUID="a6057d4d-a187-4d48-9e96-842d434e3b1e" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-tq9bh\" is forbidden: User \"system:node:ip-10-0-129-199.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-199.ec2.internal' and this object" Apr 16 20:47:52.579379 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.579336 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3c72d40d-af7b-433b-ae82-7edd71f8b76f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vnxh4\" (UID: \"3c72d40d-af7b-433b-ae82-7edd71f8b76f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" Apr 16 20:47:52.579379 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.579370 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzrzs\" (UniqueName: \"kubernetes.io/projected/3c72d40d-af7b-433b-ae82-7edd71f8b76f-kube-api-access-rzrzs\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vnxh4\" (UID: \"3c72d40d-af7b-433b-ae82-7edd71f8b76f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" Apr 16 20:47:52.579678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.579658 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3c72d40d-af7b-433b-ae82-7edd71f8b76f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vnxh4\" (UID: 
\"3c72d40d-af7b-433b-ae82-7edd71f8b76f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" Apr 16 20:47:52.590220 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.590200 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzrzs\" (UniqueName: \"kubernetes.io/projected/3c72d40d-af7b-433b-ae82-7edd71f8b76f-kube-api-access-rzrzs\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vnxh4\" (UID: \"3c72d40d-af7b-433b-ae82-7edd71f8b76f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" Apr 16 20:47:52.679912 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.679890 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvn4g\" (UniqueName: \"kubernetes.io/projected/a6057d4d-a187-4d48-9e96-842d434e3b1e-kube-api-access-lvn4g\") pod \"a6057d4d-a187-4d48-9e96-842d434e3b1e\" (UID: \"a6057d4d-a187-4d48-9e96-842d434e3b1e\") " Apr 16 20:47:52.679992 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.679933 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a6057d4d-a187-4d48-9e96-842d434e3b1e-extensions-socket-volume\") pod \"a6057d4d-a187-4d48-9e96-842d434e3b1e\" (UID: \"a6057d4d-a187-4d48-9e96-842d434e3b1e\") " Apr 16 20:47:52.680283 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.680264 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6057d4d-a187-4d48-9e96-842d434e3b1e-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "a6057d4d-a187-4d48-9e96-842d434e3b1e" (UID: "a6057d4d-a187-4d48-9e96-842d434e3b1e"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:47:52.681727 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.681707 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6057d4d-a187-4d48-9e96-842d434e3b1e-kube-api-access-lvn4g" (OuterVolumeSpecName: "kube-api-access-lvn4g") pod "a6057d4d-a187-4d48-9e96-842d434e3b1e" (UID: "a6057d4d-a187-4d48-9e96-842d434e3b1e"). InnerVolumeSpecName "kube-api-access-lvn4g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:47:52.712003 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.711982 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" Apr 16 20:47:52.781089 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.781044 2537 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lvn4g\" (UniqueName: \"kubernetes.io/projected/a6057d4d-a187-4d48-9e96-842d434e3b1e-kube-api-access-lvn4g\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:47:52.781089 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.781068 2537 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a6057d4d-a187-4d48-9e96-842d434e3b1e-extensions-socket-volume\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:47:52.842547 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:52.842521 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4"] Apr 16 20:47:52.844983 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:47:52.844959 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c72d40d_af7b_433b_ae82_7edd71f8b76f.slice/crio-9db4a9703578e5cf4e01fe7fa58bcb525241aded07abfc6ced40c91b8e8c93b5 WatchSource:0}: Error 
finding container 9db4a9703578e5cf4e01fe7fa58bcb525241aded07abfc6ced40c91b8e8c93b5: Status 404 returned error can't find the container with id 9db4a9703578e5cf4e01fe7fa58bcb525241aded07abfc6ced40c91b8e8c93b5 Apr 16 20:47:53.265185 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.265143 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j"] Apr 16 20:47:53.268427 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.268412 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" Apr 16 20:47:53.281388 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.281365 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j"] Apr 16 20:47:53.297618 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.297590 2537 status_manager.go:895] "Failed to get status for pod" podUID="a6057d4d-a187-4d48-9e96-842d434e3b1e" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-tq9bh\" is forbidden: User \"system:node:ip-10-0-129-199.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-199.ec2.internal' and this object" Apr 16 20:47:53.386111 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.386086 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z64xk\" (UniqueName: \"kubernetes.io/projected/80e4cc9c-6a18-4776-9940-378ed89ddd1d-kube-api-access-z64xk\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j\" (UID: \"80e4cc9c-6a18-4776-9940-378ed89ddd1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" Apr 16 20:47:53.386207 ip-10-0-129-199 kubenswrapper[2537]: I0416 
20:47:53.386173 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/80e4cc9c-6a18-4776-9940-378ed89ddd1d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j\" (UID: \"80e4cc9c-6a18-4776-9940-378ed89ddd1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" Apr 16 20:47:53.422870 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.422837 2537 generic.go:358] "Generic (PLEG): container finished" podID="a6057d4d-a187-4d48-9e96-842d434e3b1e" containerID="b427018db9f1cd41e55b8fb5214b01de12f4056ed0adfc925397a183d7480137" exitCode=0 Apr 16 20:47:53.423194 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.422896 2537 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" Apr 16 20:47:53.423194 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.422928 2537 scope.go:117] "RemoveContainer" containerID="b427018db9f1cd41e55b8fb5214b01de12f4056ed0adfc925397a183d7480137" Apr 16 20:47:53.424552 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.424523 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" event={"ID":"3c72d40d-af7b-433b-ae82-7edd71f8b76f","Type":"ContainerStarted","Data":"105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d"} Apr 16 20:47:53.424651 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.424577 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" event={"ID":"3c72d40d-af7b-433b-ae82-7edd71f8b76f","Type":"ContainerStarted","Data":"9db4a9703578e5cf4e01fe7fa58bcb525241aded07abfc6ced40c91b8e8c93b5"} Apr 16 20:47:53.424720 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.424708 2537 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" Apr 16 20:47:53.425741 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.425715 2537 status_manager.go:895] "Failed to get status for pod" podUID="a6057d4d-a187-4d48-9e96-842d434e3b1e" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-tq9bh\" is forbidden: User \"system:node:ip-10-0-129-199.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-199.ec2.internal' and this object" Apr 16 20:47:53.427627 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.427601 2537 status_manager.go:895] "Failed to get status for pod" podUID="a6057d4d-a187-4d48-9e96-842d434e3b1e" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-tq9bh\" is forbidden: User \"system:node:ip-10-0-129-199.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-199.ec2.internal' and this object" Apr 16 20:47:53.431021 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.431005 2537 scope.go:117] "RemoveContainer" containerID="b427018db9f1cd41e55b8fb5214b01de12f4056ed0adfc925397a183d7480137" Apr 16 20:47:53.431265 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:47:53.431249 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b427018db9f1cd41e55b8fb5214b01de12f4056ed0adfc925397a183d7480137\": container with ID starting with b427018db9f1cd41e55b8fb5214b01de12f4056ed0adfc925397a183d7480137 not found: ID does not exist" containerID="b427018db9f1cd41e55b8fb5214b01de12f4056ed0adfc925397a183d7480137" Apr 16 20:47:53.431302 ip-10-0-129-199 kubenswrapper[2537]: I0416 
20:47:53.431272 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b427018db9f1cd41e55b8fb5214b01de12f4056ed0adfc925397a183d7480137"} err="failed to get container status \"b427018db9f1cd41e55b8fb5214b01de12f4056ed0adfc925397a183d7480137\": rpc error: code = NotFound desc = could not find container \"b427018db9f1cd41e55b8fb5214b01de12f4056ed0adfc925397a183d7480137\": container with ID starting with b427018db9f1cd41e55b8fb5214b01de12f4056ed0adfc925397a183d7480137 not found: ID does not exist" Apr 16 20:47:53.454677 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.454636 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" podStartSLOduration=1.454625979 podStartE2EDuration="1.454625979s" podCreationTimestamp="2026-04-16 20:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:47:53.453982618 +0000 UTC m=+612.439126210" watchObservedRunningTime="2026-04-16 20:47:53.454625979 +0000 UTC m=+612.439769569" Apr 16 20:47:53.455758 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.455736 2537 status_manager.go:895] "Failed to get status for pod" podUID="a6057d4d-a187-4d48-9e96-842d434e3b1e" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-tq9bh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-tq9bh\" is forbidden: User \"system:node:ip-10-0-129-199.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-199.ec2.internal' and this object" Apr 16 20:47:53.486830 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.486809 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z64xk\" (UniqueName: 
\"kubernetes.io/projected/80e4cc9c-6a18-4776-9940-378ed89ddd1d-kube-api-access-z64xk\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j\" (UID: \"80e4cc9c-6a18-4776-9940-378ed89ddd1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" Apr 16 20:47:53.486907 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.486872 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/80e4cc9c-6a18-4776-9940-378ed89ddd1d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j\" (UID: \"80e4cc9c-6a18-4776-9940-378ed89ddd1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" Apr 16 20:47:53.487138 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.487124 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/80e4cc9c-6a18-4776-9940-378ed89ddd1d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j\" (UID: \"80e4cc9c-6a18-4776-9940-378ed89ddd1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" Apr 16 20:47:53.495421 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.495397 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z64xk\" (UniqueName: \"kubernetes.io/projected/80e4cc9c-6a18-4776-9940-378ed89ddd1d-kube-api-access-z64xk\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j\" (UID: \"80e4cc9c-6a18-4776-9940-378ed89ddd1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" Apr 16 20:47:53.578983 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.578914 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" Apr 16 20:47:53.591585 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.591544 2537 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6057d4d-a187-4d48-9e96-842d434e3b1e" path="/var/lib/kubelet/pods/a6057d4d-a187-4d48-9e96-842d434e3b1e/volumes" Apr 16 20:47:53.701758 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:53.701730 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j"] Apr 16 20:47:53.703657 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:47:53.703630 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80e4cc9c_6a18_4776_9940_378ed89ddd1d.slice/crio-1d306b77c357bbe2a92e79e880f9a470ff84dd3f2b384f7fa59b33cd79d5b3f1 WatchSource:0}: Error finding container 1d306b77c357bbe2a92e79e880f9a470ff84dd3f2b384f7fa59b33cd79d5b3f1: Status 404 returned error can't find the container with id 1d306b77c357bbe2a92e79e880f9a470ff84dd3f2b384f7fa59b33cd79d5b3f1 Apr 16 20:47:54.430329 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:54.430298 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" event={"ID":"80e4cc9c-6a18-4776-9940-378ed89ddd1d","Type":"ContainerStarted","Data":"115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea"} Apr 16 20:47:54.430329 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:54.430333 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" event={"ID":"80e4cc9c-6a18-4776-9940-378ed89ddd1d","Type":"ContainerStarted","Data":"1d306b77c357bbe2a92e79e880f9a470ff84dd3f2b384f7fa59b33cd79d5b3f1"} Apr 16 20:47:54.430766 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:54.430582 2537 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" Apr 16 20:47:54.462332 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:47:54.462278 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" podStartSLOduration=1.462252544 podStartE2EDuration="1.462252544s" podCreationTimestamp="2026-04-16 20:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:47:54.461227324 +0000 UTC m=+613.446370915" watchObservedRunningTime="2026-04-16 20:47:54.462252544 +0000 UTC m=+613.447396134" Apr 16 20:48:04.432826 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:04.432762 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" Apr 16 20:48:05.435859 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:05.435829 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" Apr 16 20:48:05.513146 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:05.513116 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4"] Apr 16 20:48:05.513417 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:05.513393 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" podUID="3c72d40d-af7b-433b-ae82-7edd71f8b76f" containerName="manager" containerID="cri-o://105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d" gracePeriod=10 Apr 16 20:48:05.755847 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:05.755827 2537 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" Apr 16 20:48:05.876283 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:05.876258 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3c72d40d-af7b-433b-ae82-7edd71f8b76f-extensions-socket-volume\") pod \"3c72d40d-af7b-433b-ae82-7edd71f8b76f\" (UID: \"3c72d40d-af7b-433b-ae82-7edd71f8b76f\") " Apr 16 20:48:05.876393 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:05.876298 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzrzs\" (UniqueName: \"kubernetes.io/projected/3c72d40d-af7b-433b-ae82-7edd71f8b76f-kube-api-access-rzrzs\") pod \"3c72d40d-af7b-433b-ae82-7edd71f8b76f\" (UID: \"3c72d40d-af7b-433b-ae82-7edd71f8b76f\") " Apr 16 20:48:05.876678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:05.876648 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c72d40d-af7b-433b-ae82-7edd71f8b76f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "3c72d40d-af7b-433b-ae82-7edd71f8b76f" (UID: "3c72d40d-af7b-433b-ae82-7edd71f8b76f"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:48:05.878317 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:05.878296 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c72d40d-af7b-433b-ae82-7edd71f8b76f-kube-api-access-rzrzs" (OuterVolumeSpecName: "kube-api-access-rzrzs") pod "3c72d40d-af7b-433b-ae82-7edd71f8b76f" (UID: "3c72d40d-af7b-433b-ae82-7edd71f8b76f"). InnerVolumeSpecName "kube-api-access-rzrzs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:48:05.977797 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:05.977767 2537 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3c72d40d-af7b-433b-ae82-7edd71f8b76f-extensions-socket-volume\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:48:05.977797 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:05.977793 2537 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rzrzs\" (UniqueName: \"kubernetes.io/projected/3c72d40d-af7b-433b-ae82-7edd71f8b76f-kube-api-access-rzrzs\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:48:06.468812 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:06.468771 2537 generic.go:358] "Generic (PLEG): container finished" podID="3c72d40d-af7b-433b-ae82-7edd71f8b76f" containerID="105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d" exitCode=0 Apr 16 20:48:06.469261 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:06.468829 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" event={"ID":"3c72d40d-af7b-433b-ae82-7edd71f8b76f","Type":"ContainerDied","Data":"105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d"} Apr 16 20:48:06.469261 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:06.468854 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" event={"ID":"3c72d40d-af7b-433b-ae82-7edd71f8b76f","Type":"ContainerDied","Data":"9db4a9703578e5cf4e01fe7fa58bcb525241aded07abfc6ced40c91b8e8c93b5"} Apr 16 20:48:06.469261 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:06.468861 2537 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4" Apr 16 20:48:06.469261 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:06.468877 2537 scope.go:117] "RemoveContainer" containerID="105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d" Apr 16 20:48:06.476757 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:06.476741 2537 scope.go:117] "RemoveContainer" containerID="105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d" Apr 16 20:48:06.477000 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:48:06.476983 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d\": container with ID starting with 105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d not found: ID does not exist" containerID="105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d" Apr 16 20:48:06.477062 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:06.477008 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d"} err="failed to get container status \"105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d\": rpc error: code = NotFound desc = could not find container \"105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d\": container with ID starting with 105f14c4d36baf4e1d4e4adb06cdc14810bcf7714fbcf3ff705a6da0f89c150d not found: ID does not exist" Apr 16 20:48:06.495678 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:06.495654 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4"] Apr 16 20:48:06.501620 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:06.501601 2537 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vnxh4"] Apr 16 20:48:07.587239 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:07.587195 2537 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c72d40d-af7b-433b-ae82-7edd71f8b76f" path="/var/lib/kubelet/pods/3c72d40d-af7b-433b-ae82-7edd71f8b76f/volumes" Apr 16 20:48:27.476856 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.476825 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-md7nt"] Apr 16 20:48:27.477305 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.477143 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c72d40d-af7b-433b-ae82-7edd71f8b76f" containerName="manager" Apr 16 20:48:27.477305 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.477155 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c72d40d-af7b-433b-ae82-7edd71f8b76f" containerName="manager" Apr 16 20:48:27.477305 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.477212 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c72d40d-af7b-433b-ae82-7edd71f8b76f" containerName="manager" Apr 16 20:48:27.483795 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.483771 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-md7nt" Apr 16 20:48:27.487151 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.486939 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-7bscv\"" Apr 16 20:48:27.487649 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.487607 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-md7nt"] Apr 16 20:48:27.532333 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.532309 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qqlq\" (UniqueName: \"kubernetes.io/projected/21b7af19-8552-4601-8eb6-47ea942edd6f-kube-api-access-4qqlq\") pod \"authorino-f99f4b5cd-md7nt\" (UID: \"21b7af19-8552-4601-8eb6-47ea942edd6f\") " pod="kuadrant-system/authorino-f99f4b5cd-md7nt" Apr 16 20:48:27.632972 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.632942 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qqlq\" (UniqueName: \"kubernetes.io/projected/21b7af19-8552-4601-8eb6-47ea942edd6f-kube-api-access-4qqlq\") pod \"authorino-f99f4b5cd-md7nt\" (UID: \"21b7af19-8552-4601-8eb6-47ea942edd6f\") " pod="kuadrant-system/authorino-f99f4b5cd-md7nt" Apr 16 20:48:27.641236 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.641213 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qqlq\" (UniqueName: \"kubernetes.io/projected/21b7af19-8552-4601-8eb6-47ea942edd6f-kube-api-access-4qqlq\") pod \"authorino-f99f4b5cd-md7nt\" (UID: \"21b7af19-8552-4601-8eb6-47ea942edd6f\") " pod="kuadrant-system/authorino-f99f4b5cd-md7nt" Apr 16 20:48:27.725283 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.725258 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-g8vk2"] Apr 16 20:48:27.730577 ip-10-0-129-199 
kubenswrapper[2537]: I0416 20:48:27.730495 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-g8vk2" Apr 16 20:48:27.735631 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.735604 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-g8vk2"] Apr 16 20:48:27.794079 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.794042 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-md7nt" Apr 16 20:48:27.835099 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.835071 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gftxq\" (UniqueName: \"kubernetes.io/projected/f294644d-e4e9-4495-b665-ade78a8a82d4-kube-api-access-gftxq\") pod \"authorino-7498df8756-g8vk2\" (UID: \"f294644d-e4e9-4495-b665-ade78a8a82d4\") " pod="kuadrant-system/authorino-7498df8756-g8vk2" Apr 16 20:48:27.908446 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.908406 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-md7nt"] Apr 16 20:48:27.911363 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:48:27.911336 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21b7af19_8552_4601_8eb6_47ea942edd6f.slice/crio-b63604fcd2f2dc096c6ecffbca79d56d76816dc4ae24a34d8e47b0f5aa5efcf4 WatchSource:0}: Error finding container b63604fcd2f2dc096c6ecffbca79d56d76816dc4ae24a34d8e47b0f5aa5efcf4: Status 404 returned error can't find the container with id b63604fcd2f2dc096c6ecffbca79d56d76816dc4ae24a34d8e47b0f5aa5efcf4 Apr 16 20:48:27.936506 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.936478 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gftxq\" (UniqueName: 
\"kubernetes.io/projected/f294644d-e4e9-4495-b665-ade78a8a82d4-kube-api-access-gftxq\") pod \"authorino-7498df8756-g8vk2\" (UID: \"f294644d-e4e9-4495-b665-ade78a8a82d4\") " pod="kuadrant-system/authorino-7498df8756-g8vk2" Apr 16 20:48:27.945516 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:27.945492 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gftxq\" (UniqueName: \"kubernetes.io/projected/f294644d-e4e9-4495-b665-ade78a8a82d4-kube-api-access-gftxq\") pod \"authorino-7498df8756-g8vk2\" (UID: \"f294644d-e4e9-4495-b665-ade78a8a82d4\") " pod="kuadrant-system/authorino-7498df8756-g8vk2" Apr 16 20:48:28.040956 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:28.040895 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-g8vk2" Apr 16 20:48:28.155007 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:28.154876 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-g8vk2"] Apr 16 20:48:28.157023 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:48:28.156995 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf294644d_e4e9_4495_b665_ade78a8a82d4.slice/crio-bdb8a7af17f9f1f7ea039692634ca5efebad49ec9008159e0392d47f3ebfd95b WatchSource:0}: Error finding container bdb8a7af17f9f1f7ea039692634ca5efebad49ec9008159e0392d47f3ebfd95b: Status 404 returned error can't find the container with id bdb8a7af17f9f1f7ea039692634ca5efebad49ec9008159e0392d47f3ebfd95b Apr 16 20:48:28.541544 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:28.541511 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-g8vk2" event={"ID":"f294644d-e4e9-4495-b665-ade78a8a82d4","Type":"ContainerStarted","Data":"bdb8a7af17f9f1f7ea039692634ca5efebad49ec9008159e0392d47f3ebfd95b"} Apr 16 20:48:28.542516 ip-10-0-129-199 kubenswrapper[2537]: I0416 
20:48:28.542491 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-md7nt" event={"ID":"21b7af19-8552-4601-8eb6-47ea942edd6f","Type":"ContainerStarted","Data":"b63604fcd2f2dc096c6ecffbca79d56d76816dc4ae24a34d8e47b0f5aa5efcf4"} Apr 16 20:48:30.550754 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:30.550672 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-md7nt" event={"ID":"21b7af19-8552-4601-8eb6-47ea942edd6f","Type":"ContainerStarted","Data":"0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6"} Apr 16 20:48:30.551960 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:30.551929 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-g8vk2" event={"ID":"f294644d-e4e9-4495-b665-ade78a8a82d4","Type":"ContainerStarted","Data":"81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4"} Apr 16 20:48:30.570213 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:30.570153 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-md7nt" podStartSLOduration=1.2590547939999999 podStartE2EDuration="3.570141799s" podCreationTimestamp="2026-04-16 20:48:27 +0000 UTC" firstStartedPulling="2026-04-16 20:48:27.912646987 +0000 UTC m=+646.897790566" lastFinishedPulling="2026-04-16 20:48:30.223733998 +0000 UTC m=+649.208877571" observedRunningTime="2026-04-16 20:48:30.567780081 +0000 UTC m=+649.552923672" watchObservedRunningTime="2026-04-16 20:48:30.570141799 +0000 UTC m=+649.555285407" Apr 16 20:48:30.582873 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:30.582824 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-g8vk2" podStartSLOduration=1.5237441760000001 podStartE2EDuration="3.582810568s" podCreationTimestamp="2026-04-16 20:48:27 +0000 UTC" firstStartedPulling="2026-04-16 20:48:28.158252799 +0000 UTC 
m=+647.143396374" lastFinishedPulling="2026-04-16 20:48:30.217319183 +0000 UTC m=+649.202462766" observedRunningTime="2026-04-16 20:48:30.582323215 +0000 UTC m=+649.567466806" watchObservedRunningTime="2026-04-16 20:48:30.582810568 +0000 UTC m=+649.567954161" Apr 16 20:48:30.604511 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:30.604485 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-md7nt"] Apr 16 20:48:32.560019 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:32.559977 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-md7nt" podUID="21b7af19-8552-4601-8eb6-47ea942edd6f" containerName="authorino" containerID="cri-o://0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6" gracePeriod=30 Apr 16 20:48:32.791110 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:32.791089 2537 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-md7nt" Apr 16 20:48:32.876745 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:32.876721 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qqlq\" (UniqueName: \"kubernetes.io/projected/21b7af19-8552-4601-8eb6-47ea942edd6f-kube-api-access-4qqlq\") pod \"21b7af19-8552-4601-8eb6-47ea942edd6f\" (UID: \"21b7af19-8552-4601-8eb6-47ea942edd6f\") " Apr 16 20:48:32.878683 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:32.878660 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b7af19-8552-4601-8eb6-47ea942edd6f-kube-api-access-4qqlq" (OuterVolumeSpecName: "kube-api-access-4qqlq") pod "21b7af19-8552-4601-8eb6-47ea942edd6f" (UID: "21b7af19-8552-4601-8eb6-47ea942edd6f"). InnerVolumeSpecName "kube-api-access-4qqlq". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:48:32.977510 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:32.977486 2537 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4qqlq\" (UniqueName: \"kubernetes.io/projected/21b7af19-8552-4601-8eb6-47ea942edd6f-kube-api-access-4qqlq\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:48:33.565661 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:33.565623 2537 generic.go:358] "Generic (PLEG): container finished" podID="21b7af19-8552-4601-8eb6-47ea942edd6f" containerID="0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6" exitCode=0
Apr 16 20:48:33.566100 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:33.565690 2537 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-md7nt"
Apr 16 20:48:33.566100 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:33.565700 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-md7nt" event={"ID":"21b7af19-8552-4601-8eb6-47ea942edd6f","Type":"ContainerDied","Data":"0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6"}
Apr 16 20:48:33.566100 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:33.565730 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-md7nt" event={"ID":"21b7af19-8552-4601-8eb6-47ea942edd6f","Type":"ContainerDied","Data":"b63604fcd2f2dc096c6ecffbca79d56d76816dc4ae24a34d8e47b0f5aa5efcf4"}
Apr 16 20:48:33.566100 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:33.565746 2537 scope.go:117] "RemoveContainer" containerID="0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6"
Apr 16 20:48:33.574499 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:33.574477 2537 scope.go:117] "RemoveContainer" containerID="0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6"
Apr 16 20:48:33.574811 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:48:33.574786 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6\": container with ID starting with 0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6 not found: ID does not exist" containerID="0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6"
Apr 16 20:48:33.574867 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:33.574813 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6"} err="failed to get container status \"0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6\": rpc error: code = NotFound desc = could not find container \"0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6\": container with ID starting with 0f3c203f6338a238b38180b84061473061753bd9e053bc336713c935f37590c6 not found: ID does not exist"
Apr 16 20:48:33.588995 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:33.588971 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-md7nt"]
Apr 16 20:48:33.589103 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:33.589077 2537 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-md7nt"]
Apr 16 20:48:35.587183 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:48:35.587150 2537 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b7af19-8552-4601-8eb6-47ea942edd6f" path="/var/lib/kubelet/pods/21b7af19-8552-4601-8eb6-47ea942edd6f/volumes"
Apr 16 20:49:01.309313 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.309279 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7nc8x"]
Apr 16 20:49:01.309762 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.309622 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21b7af19-8552-4601-8eb6-47ea942edd6f" containerName="authorino"
Apr 16 20:49:01.309762 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.309634 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b7af19-8552-4601-8eb6-47ea942edd6f" containerName="authorino"
Apr 16 20:49:01.309762 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.309691 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="21b7af19-8552-4601-8eb6-47ea942edd6f" containerName="authorino"
Apr 16 20:49:01.314214 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.314193 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-7nc8x"
Apr 16 20:49:01.317444 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.317409 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7nc8x"]
Apr 16 20:49:01.396595 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.396540 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlkhx\" (UniqueName: \"kubernetes.io/projected/6b6c1948-cc96-4857-a9c6-8099a08202c6-kube-api-access-wlkhx\") pod \"authorino-8b475cf9f-7nc8x\" (UID: \"6b6c1948-cc96-4857-a9c6-8099a08202c6\") " pod="kuadrant-system/authorino-8b475cf9f-7nc8x"
Apr 16 20:49:01.497763 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.497733 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlkhx\" (UniqueName: \"kubernetes.io/projected/6b6c1948-cc96-4857-a9c6-8099a08202c6-kube-api-access-wlkhx\") pod \"authorino-8b475cf9f-7nc8x\" (UID: \"6b6c1948-cc96-4857-a9c6-8099a08202c6\") " pod="kuadrant-system/authorino-8b475cf9f-7nc8x"
Apr 16 20:49:01.505734 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.505709 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlkhx\" (UniqueName: \"kubernetes.io/projected/6b6c1948-cc96-4857-a9c6-8099a08202c6-kube-api-access-wlkhx\") pod \"authorino-8b475cf9f-7nc8x\" (UID: \"6b6c1948-cc96-4857-a9c6-8099a08202c6\") " pod="kuadrant-system/authorino-8b475cf9f-7nc8x"
Apr 16 20:49:01.548258 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.548232 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7nc8x"]
Apr 16 20:49:01.548446 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.548434 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-7nc8x"
Apr 16 20:49:01.573751 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.573676 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6cff784565-smjvw"]
Apr 16 20:49:01.578999 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.578969 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6cff784565-smjvw"
Apr 16 20:49:01.590308 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.590280 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6cff784565-smjvw"]
Apr 16 20:49:01.671752 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.671662 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7nc8x"]
Apr 16 20:49:01.674884 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:49:01.674855 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b6c1948_cc96_4857_a9c6_8099a08202c6.slice/crio-52183abea5363e20a761c789493848e589016c076415c03a0da8ed6c2fe16587 WatchSource:0}: Error finding container 52183abea5363e20a761c789493848e589016c076415c03a0da8ed6c2fe16587: Status 404 returned error can't find the container with id 52183abea5363e20a761c789493848e589016c076415c03a0da8ed6c2fe16587
Apr 16 20:49:01.699644 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.699621 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jztr\" (UniqueName: \"kubernetes.io/projected/b9704ac0-9128-41f3-a31d-b3a862a36fab-kube-api-access-8jztr\") pod \"authorino-6cff784565-smjvw\" (UID: \"b9704ac0-9128-41f3-a31d-b3a862a36fab\") " pod="kuadrant-system/authorino-6cff784565-smjvw"
Apr 16 20:49:01.800503 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.800463 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jztr\" (UniqueName: \"kubernetes.io/projected/b9704ac0-9128-41f3-a31d-b3a862a36fab-kube-api-access-8jztr\") pod \"authorino-6cff784565-smjvw\" (UID: \"b9704ac0-9128-41f3-a31d-b3a862a36fab\") " pod="kuadrant-system/authorino-6cff784565-smjvw"
Apr 16 20:49:01.808256 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.808235 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jztr\" (UniqueName: \"kubernetes.io/projected/b9704ac0-9128-41f3-a31d-b3a862a36fab-kube-api-access-8jztr\") pod \"authorino-6cff784565-smjvw\" (UID: \"b9704ac0-9128-41f3-a31d-b3a862a36fab\") " pod="kuadrant-system/authorino-6cff784565-smjvw"
Apr 16 20:49:01.894854 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:01.894820 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6cff784565-smjvw"
Apr 16 20:49:02.007792 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.007770 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6cff784565-smjvw"]
Apr 16 20:49:02.013644 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:49:02.013620 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9704ac0_9128_41f3_a31d_b3a862a36fab.slice/crio-c2f37f9bbafae8fa70b17310aa86d7cd62cb0ab4f501b14b9d3783b1c85267a3 WatchSource:0}: Error finding container c2f37f9bbafae8fa70b17310aa86d7cd62cb0ab4f501b14b9d3783b1c85267a3: Status 404 returned error can't find the container with id c2f37f9bbafae8fa70b17310aa86d7cd62cb0ab4f501b14b9d3783b1c85267a3
Apr 16 20:49:02.441286 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.441260 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6cff784565-smjvw"]
Apr 16 20:49:02.473660 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.473624 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5544d6787c-dhx56"]
Apr 16 20:49:02.478747 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.478716 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5544d6787c-dhx56"
Apr 16 20:49:02.481233 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.481213 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 16 20:49:02.491309 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.491278 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5544d6787c-dhx56"]
Apr 16 20:49:02.608085 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.608046 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f46dm\" (UniqueName: \"kubernetes.io/projected/6332ba3b-f489-4524-9e98-d5fab9a13f4a-kube-api-access-f46dm\") pod \"authorino-5544d6787c-dhx56\" (UID: \"6332ba3b-f489-4524-9e98-d5fab9a13f4a\") " pod="kuadrant-system/authorino-5544d6787c-dhx56"
Apr 16 20:49:02.608259 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.608107 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6332ba3b-f489-4524-9e98-d5fab9a13f4a-tls-cert\") pod \"authorino-5544d6787c-dhx56\" (UID: \"6332ba3b-f489-4524-9e98-d5fab9a13f4a\") " pod="kuadrant-system/authorino-5544d6787c-dhx56"
Apr 16 20:49:02.661172 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.661097 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-7nc8x" event={"ID":"6b6c1948-cc96-4857-a9c6-8099a08202c6","Type":"ContainerStarted","Data":"83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733"}
Apr 16 20:49:02.661172 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.661136 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-7nc8x" event={"ID":"6b6c1948-cc96-4857-a9c6-8099a08202c6","Type":"ContainerStarted","Data":"52183abea5363e20a761c789493848e589016c076415c03a0da8ed6c2fe16587"}
Apr 16 20:49:02.661172 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.661106 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-7nc8x" podUID="6b6c1948-cc96-4857-a9c6-8099a08202c6" containerName="authorino" containerID="cri-o://83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733" gracePeriod=30
Apr 16 20:49:02.662930 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.662898 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6cff784565-smjvw" event={"ID":"b9704ac0-9128-41f3-a31d-b3a862a36fab","Type":"ContainerStarted","Data":"0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe"}
Apr 16 20:49:02.663034 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.662938 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6cff784565-smjvw" event={"ID":"b9704ac0-9128-41f3-a31d-b3a862a36fab","Type":"ContainerStarted","Data":"c2f37f9bbafae8fa70b17310aa86d7cd62cb0ab4f501b14b9d3783b1c85267a3"}
Apr 16 20:49:02.663034 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.662971 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-6cff784565-smjvw" podUID="b9704ac0-9128-41f3-a31d-b3a862a36fab" containerName="authorino" containerID="cri-o://0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe" gracePeriod=30
Apr 16 20:49:02.677537 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.677498 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-7nc8x" podStartSLOduration=1.371560658 podStartE2EDuration="1.677485959s" podCreationTimestamp="2026-04-16 20:49:01 +0000 UTC" firstStartedPulling="2026-04-16 20:49:01.67619872 +0000 UTC m=+680.661342294" lastFinishedPulling="2026-04-16 20:49:01.982124013 +0000 UTC m=+680.967267595" observedRunningTime="2026-04-16 20:49:02.675134223 +0000 UTC m=+681.660277816" watchObservedRunningTime="2026-04-16 20:49:02.677485959 +0000 UTC m=+681.662629549"
Apr 16 20:49:02.690183 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.690141 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-6cff784565-smjvw" podStartSLOduration=1.280679375 podStartE2EDuration="1.690131519s" podCreationTimestamp="2026-04-16 20:49:01 +0000 UTC" firstStartedPulling="2026-04-16 20:49:02.014863235 +0000 UTC m=+681.000006809" lastFinishedPulling="2026-04-16 20:49:02.424315381 +0000 UTC m=+681.409458953" observedRunningTime="2026-04-16 20:49:02.68885156 +0000 UTC m=+681.673995150" watchObservedRunningTime="2026-04-16 20:49:02.690131519 +0000 UTC m=+681.675275108"
Apr 16 20:49:02.709358 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.709333 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f46dm\" (UniqueName: \"kubernetes.io/projected/6332ba3b-f489-4524-9e98-d5fab9a13f4a-kube-api-access-f46dm\") pod \"authorino-5544d6787c-dhx56\" (UID: \"6332ba3b-f489-4524-9e98-d5fab9a13f4a\") " pod="kuadrant-system/authorino-5544d6787c-dhx56"
Apr 16 20:49:02.709447 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.709384 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6332ba3b-f489-4524-9e98-d5fab9a13f4a-tls-cert\") pod \"authorino-5544d6787c-dhx56\" (UID: \"6332ba3b-f489-4524-9e98-d5fab9a13f4a\") " pod="kuadrant-system/authorino-5544d6787c-dhx56"
Apr 16 20:49:02.711464 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.711447 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6332ba3b-f489-4524-9e98-d5fab9a13f4a-tls-cert\") pod \"authorino-5544d6787c-dhx56\" (UID: \"6332ba3b-f489-4524-9e98-d5fab9a13f4a\") " pod="kuadrant-system/authorino-5544d6787c-dhx56"
Apr 16 20:49:02.716292 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.716275 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f46dm\" (UniqueName: \"kubernetes.io/projected/6332ba3b-f489-4524-9e98-d5fab9a13f4a-kube-api-access-f46dm\") pod \"authorino-5544d6787c-dhx56\" (UID: \"6332ba3b-f489-4524-9e98-d5fab9a13f4a\") " pod="kuadrant-system/authorino-5544d6787c-dhx56"
Apr 16 20:49:02.788751 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.788726 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5544d6787c-dhx56"
Apr 16 20:49:02.923932 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.923765 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5544d6787c-dhx56"]
Apr 16 20:49:02.926104 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:49:02.926045 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6332ba3b_f489_4524_9e98_d5fab9a13f4a.slice/crio-cd26d67a1070eb20d5e1305cceeffb570c612538812316a062d0bb5c9a090a63 WatchSource:0}: Error finding container cd26d67a1070eb20d5e1305cceeffb570c612538812316a062d0bb5c9a090a63: Status 404 returned error can't find the container with id cd26d67a1070eb20d5e1305cceeffb570c612538812316a062d0bb5c9a090a63
Apr 16 20:49:02.926437 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.926419 2537 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-7nc8x"
Apr 16 20:49:02.930011 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:02.929996 2537 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6cff784565-smjvw"
Apr 16 20:49:03.112308 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.112277 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlkhx\" (UniqueName: \"kubernetes.io/projected/6b6c1948-cc96-4857-a9c6-8099a08202c6-kube-api-access-wlkhx\") pod \"6b6c1948-cc96-4857-a9c6-8099a08202c6\" (UID: \"6b6c1948-cc96-4857-a9c6-8099a08202c6\") "
Apr 16 20:49:03.112481 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.112324 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jztr\" (UniqueName: \"kubernetes.io/projected/b9704ac0-9128-41f3-a31d-b3a862a36fab-kube-api-access-8jztr\") pod \"b9704ac0-9128-41f3-a31d-b3a862a36fab\" (UID: \"b9704ac0-9128-41f3-a31d-b3a862a36fab\") "
Apr 16 20:49:03.114282 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.114239 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6c1948-cc96-4857-a9c6-8099a08202c6-kube-api-access-wlkhx" (OuterVolumeSpecName: "kube-api-access-wlkhx") pod "6b6c1948-cc96-4857-a9c6-8099a08202c6" (UID: "6b6c1948-cc96-4857-a9c6-8099a08202c6"). InnerVolumeSpecName "kube-api-access-wlkhx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:49:03.114394 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.114281 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9704ac0-9128-41f3-a31d-b3a862a36fab-kube-api-access-8jztr" (OuterVolumeSpecName: "kube-api-access-8jztr") pod "b9704ac0-9128-41f3-a31d-b3a862a36fab" (UID: "b9704ac0-9128-41f3-a31d-b3a862a36fab"). InnerVolumeSpecName "kube-api-access-8jztr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:49:03.212877 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.212817 2537 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wlkhx\" (UniqueName: \"kubernetes.io/projected/6b6c1948-cc96-4857-a9c6-8099a08202c6-kube-api-access-wlkhx\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:49:03.212877 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.212842 2537 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8jztr\" (UniqueName: \"kubernetes.io/projected/b9704ac0-9128-41f3-a31d-b3a862a36fab-kube-api-access-8jztr\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:49:03.667405 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.667375 2537 generic.go:358] "Generic (PLEG): container finished" podID="b9704ac0-9128-41f3-a31d-b3a862a36fab" containerID="0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe" exitCode=0
Apr 16 20:49:03.667764 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.667430 2537 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6cff784565-smjvw"
Apr 16 20:49:03.667764 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.667454 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6cff784565-smjvw" event={"ID":"b9704ac0-9128-41f3-a31d-b3a862a36fab","Type":"ContainerDied","Data":"0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe"}
Apr 16 20:49:03.667764 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.667484 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6cff784565-smjvw" event={"ID":"b9704ac0-9128-41f3-a31d-b3a862a36fab","Type":"ContainerDied","Data":"c2f37f9bbafae8fa70b17310aa86d7cd62cb0ab4f501b14b9d3783b1c85267a3"}
Apr 16 20:49:03.667764 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.667511 2537 scope.go:117] "RemoveContainer" containerID="0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe"
Apr 16 20:49:03.668699 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.668580 2537 generic.go:358] "Generic (PLEG): container finished" podID="6b6c1948-cc96-4857-a9c6-8099a08202c6" containerID="83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733" exitCode=0
Apr 16 20:49:03.668699 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.668635 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-7nc8x" event={"ID":"6b6c1948-cc96-4857-a9c6-8099a08202c6","Type":"ContainerDied","Data":"83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733"}
Apr 16 20:49:03.668699 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.668657 2537 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-7nc8x"
Apr 16 20:49:03.668843 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.668656 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-7nc8x" event={"ID":"6b6c1948-cc96-4857-a9c6-8099a08202c6","Type":"ContainerDied","Data":"52183abea5363e20a761c789493848e589016c076415c03a0da8ed6c2fe16587"}
Apr 16 20:49:03.670143 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.670121 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5544d6787c-dhx56" event={"ID":"6332ba3b-f489-4524-9e98-d5fab9a13f4a","Type":"ContainerStarted","Data":"5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69"}
Apr 16 20:49:03.670257 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.670148 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5544d6787c-dhx56" event={"ID":"6332ba3b-f489-4524-9e98-d5fab9a13f4a","Type":"ContainerStarted","Data":"cd26d67a1070eb20d5e1305cceeffb570c612538812316a062d0bb5c9a090a63"}
Apr 16 20:49:03.675932 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.675909 2537 scope.go:117] "RemoveContainer" containerID="0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe"
Apr 16 20:49:03.676210 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:49:03.676185 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe\": container with ID starting with 0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe not found: ID does not exist" containerID="0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe"
Apr 16 20:49:03.676295 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.676212 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe"} err="failed to get container status \"0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe\": rpc error: code = NotFound desc = could not find container \"0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe\": container with ID starting with 0144c81e7560ce0c7990febb692cd04497caeffc806b6f6ae6e136889a1a9efe not found: ID does not exist"
Apr 16 20:49:03.676295 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.676233 2537 scope.go:117] "RemoveContainer" containerID="83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733"
Apr 16 20:49:03.683019 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.682991 2537 scope.go:117] "RemoveContainer" containerID="83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733"
Apr 16 20:49:03.683270 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:49:03.683252 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733\": container with ID starting with 83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733 not found: ID does not exist" containerID="83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733"
Apr 16 20:49:03.683318 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.683276 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733"} err="failed to get container status \"83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733\": rpc error: code = NotFound desc = could not find container \"83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733\": container with ID starting with 83a9f7d9b605e5fe905fccac9ede8784e75459da8a32181f6836c61884246733 not found: ID does not exist"
Apr 16 20:49:03.689883 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.689840 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5544d6787c-dhx56" podStartSLOduration=1.327572407 podStartE2EDuration="1.689828911s" podCreationTimestamp="2026-04-16 20:49:02 +0000 UTC" firstStartedPulling="2026-04-16 20:49:02.927426471 +0000 UTC m=+681.912570040" lastFinishedPulling="2026-04-16 20:49:03.289682975 +0000 UTC m=+682.274826544" observedRunningTime="2026-04-16 20:49:03.688412741 +0000 UTC m=+682.673556331" watchObservedRunningTime="2026-04-16 20:49:03.689828911 +0000 UTC m=+682.674972502"
Apr 16 20:49:03.705690 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.705667 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7nc8x"]
Apr 16 20:49:03.712098 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.712076 2537 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7nc8x"]
Apr 16 20:49:03.715778 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.715756 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-g8vk2"]
Apr 16 20:49:03.715962 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.715942 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-g8vk2" podUID="f294644d-e4e9-4495-b665-ade78a8a82d4" containerName="authorino" containerID="cri-o://81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4" gracePeriod=30
Apr 16 20:49:03.728369 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.728329 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6cff784565-smjvw"]
Apr 16 20:49:03.731424 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.731404 2537 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-6cff784565-smjvw"]
Apr 16 20:49:03.944305 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:03.944285 2537 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-g8vk2"
Apr 16 20:49:04.121137 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:04.121057 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gftxq\" (UniqueName: \"kubernetes.io/projected/f294644d-e4e9-4495-b665-ade78a8a82d4-kube-api-access-gftxq\") pod \"f294644d-e4e9-4495-b665-ade78a8a82d4\" (UID: \"f294644d-e4e9-4495-b665-ade78a8a82d4\") "
Apr 16 20:49:04.123005 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:04.122980 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f294644d-e4e9-4495-b665-ade78a8a82d4-kube-api-access-gftxq" (OuterVolumeSpecName: "kube-api-access-gftxq") pod "f294644d-e4e9-4495-b665-ade78a8a82d4" (UID: "f294644d-e4e9-4495-b665-ade78a8a82d4"). InnerVolumeSpecName "kube-api-access-gftxq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:49:04.222580 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:04.222521 2537 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gftxq\" (UniqueName: \"kubernetes.io/projected/f294644d-e4e9-4495-b665-ade78a8a82d4-kube-api-access-gftxq\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 16 20:49:04.675834 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:04.675802 2537 generic.go:358] "Generic (PLEG): container finished" podID="f294644d-e4e9-4495-b665-ade78a8a82d4" containerID="81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4" exitCode=0
Apr 16 20:49:04.676232 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:04.675851 2537 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-g8vk2"
Apr 16 20:49:04.676232 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:04.675881 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-g8vk2" event={"ID":"f294644d-e4e9-4495-b665-ade78a8a82d4","Type":"ContainerDied","Data":"81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4"}
Apr 16 20:49:04.676232 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:04.675912 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-g8vk2" event={"ID":"f294644d-e4e9-4495-b665-ade78a8a82d4","Type":"ContainerDied","Data":"bdb8a7af17f9f1f7ea039692634ca5efebad49ec9008159e0392d47f3ebfd95b"}
Apr 16 20:49:04.676232 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:04.675928 2537 scope.go:117] "RemoveContainer" containerID="81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4"
Apr 16 20:49:04.684461 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:04.684439 2537 scope.go:117] "RemoveContainer" containerID="81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4"
Apr 16 20:49:04.684737 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:49:04.684716 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4\": container with ID starting with 81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4 not found: ID does not exist" containerID="81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4"
Apr 16 20:49:04.684834 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:04.684747 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4"} err="failed to get container status \"81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4\": rpc error: code = NotFound desc = could not find container \"81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4\": container with ID starting with 81a585f0adc225f1f0a15d4231832582f46c169b49c43d5a11766485c23c3fa4 not found: ID does not exist"
Apr 16 20:49:04.698495 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:04.698462 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-g8vk2"]
Apr 16 20:49:04.699879 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:04.699856 2537 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-g8vk2"]
Apr 16 20:49:05.589962 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:05.589929 2537 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6c1948-cc96-4857-a9c6-8099a08202c6" path="/var/lib/kubelet/pods/6b6c1948-cc96-4857-a9c6-8099a08202c6/volumes"
Apr 16 20:49:05.590240 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:05.590228 2537 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9704ac0-9128-41f3-a31d-b3a862a36fab" path="/var/lib/kubelet/pods/b9704ac0-9128-41f3-a31d-b3a862a36fab/volumes"
Apr 16 20:49:05.590514 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:05.590503 2537 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f294644d-e4e9-4495-b665-ade78a8a82d4" path="/var/lib/kubelet/pods/f294644d-e4e9-4495-b665-ade78a8a82d4/volumes"
Apr 16 20:49:51.638881 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.638840 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5"]
Apr 16 20:49:51.639426 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.639339 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9704ac0-9128-41f3-a31d-b3a862a36fab" containerName="authorino"
Apr 16 20:49:51.639426 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.639358 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9704ac0-9128-41f3-a31d-b3a862a36fab" containerName="authorino"
Apr 16 20:49:51.639426 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.639373 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f294644d-e4e9-4495-b665-ade78a8a82d4" containerName="authorino"
Apr 16 20:49:51.639426 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.639382 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="f294644d-e4e9-4495-b665-ade78a8a82d4" containerName="authorino"
Apr 16 20:49:51.639426 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.639415 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b6c1948-cc96-4857-a9c6-8099a08202c6" containerName="authorino"
Apr 16 20:49:51.639426 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.639424 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6c1948-cc96-4857-a9c6-8099a08202c6" containerName="authorino"
Apr 16 20:49:51.639802 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.639505 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b6c1948-cc96-4857-a9c6-8099a08202c6" containerName="authorino"
Apr 16 20:49:51.639802 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.639521 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9704ac0-9128-41f3-a31d-b3a862a36fab" containerName="authorino"
Apr 16 20:49:51.639802 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.639531 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="f294644d-e4e9-4495-b665-ade78a8a82d4" containerName="authorino"
Apr 16 20:49:51.644113 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.644092 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5"
Apr 16 20:49:51.646572 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.646533 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 16 20:49:51.646706 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.646687 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 16 20:49:51.647610 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.647589 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-fbmqf\""
Apr 16 20:49:51.647707 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.647662 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\""
Apr 16 20:49:51.652938 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.652896 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5"]
Apr 16 20:49:51.783282 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.783254 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcxsd\" (UniqueName: \"kubernetes.io/projected/41f31c78-31b2-4d46-ac75-14dc38559497-kube-api-access-pcxsd\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5"
Apr 16 20:49:51.783444 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.783291 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41f31c78-31b2-4d46-ac75-14dc38559497-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5"
Apr 16 20:49:51.783444 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.783330 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41f31c78-31b2-4d46-ac75-14dc38559497-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5"
Apr 16 20:49:51.783444 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.783392 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41f31c78-31b2-4d46-ac75-14dc38559497-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5"
Apr 16 20:49:51.783444 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.783426 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/41f31c78-31b2-4d46-ac75-14dc38559497-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5"
Apr 16 20:49:51.783601 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.783445 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/41f31c78-31b2-4d46-ac75-14dc38559497-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5"
Apr 16 20:49:51.884212 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.884182 2537 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"kube-api-access-pcxsd\" (UniqueName: \"kubernetes.io/projected/41f31c78-31b2-4d46-ac75-14dc38559497-kube-api-access-pcxsd\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:51.884212 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.884216 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41f31c78-31b2-4d46-ac75-14dc38559497-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:51.884426 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.884237 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41f31c78-31b2-4d46-ac75-14dc38559497-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:51.884426 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.884347 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41f31c78-31b2-4d46-ac75-14dc38559497-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:51.884426 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.884396 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/41f31c78-31b2-4d46-ac75-14dc38559497-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: 
\"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:51.884579 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.884426 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/41f31c78-31b2-4d46-ac75-14dc38559497-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:51.884643 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.884623 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41f31c78-31b2-4d46-ac75-14dc38559497-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:51.884706 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.884688 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41f31c78-31b2-4d46-ac75-14dc38559497-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:51.884792 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.884751 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/41f31c78-31b2-4d46-ac75-14dc38559497-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:51.886503 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.886478 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/41f31c78-31b2-4d46-ac75-14dc38559497-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:51.886895 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.886877 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41f31c78-31b2-4d46-ac75-14dc38559497-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:51.891389 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.891340 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcxsd\" (UniqueName: \"kubernetes.io/projected/41f31c78-31b2-4d46-ac75-14dc38559497-kube-api-access-pcxsd\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5\" (UID: \"41f31c78-31b2-4d46-ac75-14dc38559497\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:51.957730 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:51.957707 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:49:52.080803 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:52.080770 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5"] Apr 16 20:49:52.084063 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:49:52.084029 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41f31c78_31b2_4d46_ac75_14dc38559497.slice/crio-e026caf72c107de6fd726ae5634010a13b13c53e87eface9ba92da4a2dead5c7 WatchSource:0}: Error finding container e026caf72c107de6fd726ae5634010a13b13c53e87eface9ba92da4a2dead5c7: Status 404 returned error can't find the container with id e026caf72c107de6fd726ae5634010a13b13c53e87eface9ba92da4a2dead5c7 Apr 16 20:49:52.085740 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:52.085720 2537 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:49:52.832170 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:52.832133 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" event={"ID":"41f31c78-31b2-4d46-ac75-14dc38559497","Type":"ContainerStarted","Data":"e026caf72c107de6fd726ae5634010a13b13c53e87eface9ba92da4a2dead5c7"} Apr 16 20:49:57.851727 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:49:57.851644 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" event={"ID":"41f31c78-31b2-4d46-ac75-14dc38559497","Type":"ContainerStarted","Data":"7520ec096964222576c18a5496b68a2afd793679d83edba2f76921bb1dc10848"} Apr 16 20:50:02.872347 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:02.872261 2537 generic.go:358] "Generic (PLEG): container finished" podID="41f31c78-31b2-4d46-ac75-14dc38559497" containerID="7520ec096964222576c18a5496b68a2afd793679d83edba2f76921bb1dc10848" exitCode=0 Apr 16 
20:50:02.872347 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:02.872313 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" event={"ID":"41f31c78-31b2-4d46-ac75-14dc38559497","Type":"ContainerDied","Data":"7520ec096964222576c18a5496b68a2afd793679d83edba2f76921bb1dc10848"} Apr 16 20:50:06.889980 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:06.889947 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" event={"ID":"41f31c78-31b2-4d46-ac75-14dc38559497","Type":"ContainerStarted","Data":"1dbdb3bad1c64b4d445b7f72c994a361960b7069ac541b6d10f8ff89478b7896"} Apr 16 20:50:06.890367 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:06.890240 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:50:06.907736 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:06.907692 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" podStartSLOduration=1.5705089129999998 podStartE2EDuration="15.907680603s" podCreationTimestamp="2026-04-16 20:49:51 +0000 UTC" firstStartedPulling="2026-04-16 20:49:52.08585514 +0000 UTC m=+731.070998712" lastFinishedPulling="2026-04-16 20:50:06.423026829 +0000 UTC m=+745.408170402" observedRunningTime="2026-04-16 20:50:06.906107989 +0000 UTC m=+745.891251592" watchObservedRunningTime="2026-04-16 20:50:06.907680603 +0000 UTC m=+745.892824194" Apr 16 20:50:17.906958 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:17.906927 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5" Apr 16 20:50:32.941301 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:32.941267 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl"] Apr 16 
20:50:32.961816 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:32.961788 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl"] Apr 16 20:50:32.961982 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:32.961893 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:32.964753 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:32.964730 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 16 20:50:33.017863 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.017826 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be8360e6-fbc3-4161-87ed-bbcc08568b06-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.018024 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.017887 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/be8360e6-fbc3-4161-87ed-bbcc08568b06-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.018024 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.017987 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/be8360e6-fbc3-4161-87ed-bbcc08568b06-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " 
pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.018024 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.018014 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/be8360e6-fbc3-4161-87ed-bbcc08568b06-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.018186 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.018039 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd6q6\" (UniqueName: \"kubernetes.io/projected/be8360e6-fbc3-4161-87ed-bbcc08568b06-kube-api-access-kd6q6\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.018186 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.018121 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/be8360e6-fbc3-4161-87ed-bbcc08568b06-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.119079 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.119049 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/be8360e6-fbc3-4161-87ed-bbcc08568b06-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.119223 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.119085 2537 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be8360e6-fbc3-4161-87ed-bbcc08568b06-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.119223 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.119105 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/be8360e6-fbc3-4161-87ed-bbcc08568b06-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.119321 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.119236 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/be8360e6-fbc3-4161-87ed-bbcc08568b06-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.119321 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.119267 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/be8360e6-fbc3-4161-87ed-bbcc08568b06-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.119321 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.119298 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kd6q6\" (UniqueName: \"kubernetes.io/projected/be8360e6-fbc3-4161-87ed-bbcc08568b06-kube-api-access-kd6q6\") pod 
\"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.119465 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.119434 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/be8360e6-fbc3-4161-87ed-bbcc08568b06-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.119623 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.119590 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be8360e6-fbc3-4161-87ed-bbcc08568b06-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.119721 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.119622 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/be8360e6-fbc3-4161-87ed-bbcc08568b06-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.121532 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.121497 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/be8360e6-fbc3-4161-87ed-bbcc08568b06-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.121676 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.121659 2537 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/be8360e6-fbc3-4161-87ed-bbcc08568b06-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.132207 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.132188 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd6q6\" (UniqueName: \"kubernetes.io/projected/be8360e6-fbc3-4161-87ed-bbcc08568b06-kube-api-access-kd6q6\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl\" (UID: \"be8360e6-fbc3-4161-87ed-bbcc08568b06\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.272415 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.272349 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:33.392830 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.392802 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl"] Apr 16 20:50:33.395719 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:50:33.395691 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe8360e6_fbc3_4161_87ed_bbcc08568b06.slice/crio-b7c027cdbfde43eee5897bcee2386b4c08d9ae5e06b4ae708d4d7143b9cef6ba WatchSource:0}: Error finding container b7c027cdbfde43eee5897bcee2386b4c08d9ae5e06b4ae708d4d7143b9cef6ba: Status 404 returned error can't find the container with id b7c027cdbfde43eee5897bcee2386b4c08d9ae5e06b4ae708d4d7143b9cef6ba Apr 16 20:50:33.980621 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.980587 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" 
event={"ID":"be8360e6-fbc3-4161-87ed-bbcc08568b06","Type":"ContainerStarted","Data":"5bbc7d8c1d15c7f891219accd25dc22d07cd8c411f400665acf2032a28e7e7d5"} Apr 16 20:50:33.980621 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:33.980624 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" event={"ID":"be8360e6-fbc3-4161-87ed-bbcc08568b06","Type":"ContainerStarted","Data":"b7c027cdbfde43eee5897bcee2386b4c08d9ae5e06b4ae708d4d7143b9cef6ba"} Apr 16 20:50:38.998390 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:38.998355 2537 generic.go:358] "Generic (PLEG): container finished" podID="be8360e6-fbc3-4161-87ed-bbcc08568b06" containerID="5bbc7d8c1d15c7f891219accd25dc22d07cd8c411f400665acf2032a28e7e7d5" exitCode=0 Apr 16 20:50:38.998764 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:38.998432 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" event={"ID":"be8360e6-fbc3-4161-87ed-bbcc08568b06","Type":"ContainerDied","Data":"5bbc7d8c1d15c7f891219accd25dc22d07cd8c411f400665acf2032a28e7e7d5"} Apr 16 20:50:40.003749 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:40.003717 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" event={"ID":"be8360e6-fbc3-4161-87ed-bbcc08568b06","Type":"ContainerStarted","Data":"fbf3ad82af1c8f97baa3958d2e0ec5326e4259103541d1fbece0dd71e8a26d99"} Apr 16 20:50:40.004105 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:40.004046 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:50:40.022177 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:40.022045 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" podStartSLOduration=7.851888142 podStartE2EDuration="8.022028726s" 
podCreationTimestamp="2026-04-16 20:50:32 +0000 UTC" firstStartedPulling="2026-04-16 20:50:38.999076711 +0000 UTC m=+777.984220280" lastFinishedPulling="2026-04-16 20:50:39.169217278 +0000 UTC m=+778.154360864" observedRunningTime="2026-04-16 20:50:40.021163181 +0000 UTC m=+779.006306764" watchObservedRunningTime="2026-04-16 20:50:40.022028726 +0000 UTC m=+779.007172318" Apr 16 20:50:51.019467 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:50:51.019433 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl" Apr 16 20:51:00.060170 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.060135 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p"] Apr 16 20:51:00.064764 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.064747 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.066971 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.066950 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 16 20:51:00.070929 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.070900 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p"] Apr 16 20:51:00.145904 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.145875 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c7b4520b-3abd-44c2-8383-35c64d159211-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.146041 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.145912 2537 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f9xc\" (UniqueName: \"kubernetes.io/projected/c7b4520b-3abd-44c2-8383-35c64d159211-kube-api-access-5f9xc\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.146041 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.145931 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b4520b-3abd-44c2-8383-35c64d159211-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.146041 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.145949 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c7b4520b-3abd-44c2-8383-35c64d159211-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.146041 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.146026 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c7b4520b-3abd-44c2-8383-35c64d159211-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.146171 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.146084 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c7b4520b-3abd-44c2-8383-35c64d159211-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.246916 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.246854 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c7b4520b-3abd-44c2-8383-35c64d159211-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.246916 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.246890 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9xc\" (UniqueName: \"kubernetes.io/projected/c7b4520b-3abd-44c2-8383-35c64d159211-kube-api-access-5f9xc\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.247092 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.246916 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b4520b-3abd-44c2-8383-35c64d159211-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.247092 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.247038 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c7b4520b-3abd-44c2-8383-35c64d159211-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.247205 
ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.247112 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c7b4520b-3abd-44c2-8383-35c64d159211-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.247205 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.247174 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7b4520b-3abd-44c2-8383-35c64d159211-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.247410 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.247390 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c7b4520b-3abd-44c2-8383-35c64d159211-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.247472 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.247455 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7b4520b-3abd-44c2-8383-35c64d159211-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.247547 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.247522 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c7b4520b-3abd-44c2-8383-35c64d159211-model-cache\") pod 
\"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.249035 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.249014 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c7b4520b-3abd-44c2-8383-35c64d159211-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.249294 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.249277 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b4520b-3abd-44c2-8383-35c64d159211-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.254535 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.254517 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f9xc\" (UniqueName: \"kubernetes.io/projected/c7b4520b-3abd-44c2-8383-35c64d159211-kube-api-access-5f9xc\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-2gj2p\" (UID: \"c7b4520b-3abd-44c2-8383-35c64d159211\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.375643 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.375623 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:00.496455 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:00.496432 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p"] Apr 16 20:51:00.498336 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:51:00.498300 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7b4520b_3abd_44c2_8383_35c64d159211.slice/crio-bf96b58690b496cbd3b327f8b078ff60d837e5c4998650d466db2d075914a6a2 WatchSource:0}: Error finding container bf96b58690b496cbd3b327f8b078ff60d837e5c4998650d466db2d075914a6a2: Status 404 returned error can't find the container with id bf96b58690b496cbd3b327f8b078ff60d837e5c4998650d466db2d075914a6a2 Apr 16 20:51:01.080708 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:01.080675 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" event={"ID":"c7b4520b-3abd-44c2-8383-35c64d159211","Type":"ContainerStarted","Data":"98a8e5a7c62563bdbb32aa7abb34397739fd927a93450ca04bc87ce6bf0fe0a4"} Apr 16 20:51:01.081071 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:01.080715 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" event={"ID":"c7b4520b-3abd-44c2-8383-35c64d159211","Type":"ContainerStarted","Data":"bf96b58690b496cbd3b327f8b078ff60d837e5c4998650d466db2d075914a6a2"} Apr 16 20:51:06.104651 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:06.104543 2537 generic.go:358] "Generic (PLEG): container finished" podID="c7b4520b-3abd-44c2-8383-35c64d159211" containerID="98a8e5a7c62563bdbb32aa7abb34397739fd927a93450ca04bc87ce6bf0fe0a4" exitCode=0 Apr 16 20:51:06.104651 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:06.104596 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" 
event={"ID":"c7b4520b-3abd-44c2-8383-35c64d159211","Type":"ContainerDied","Data":"98a8e5a7c62563bdbb32aa7abb34397739fd927a93450ca04bc87ce6bf0fe0a4"} Apr 16 20:51:07.109020 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:07.108988 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" event={"ID":"c7b4520b-3abd-44c2-8383-35c64d159211","Type":"ContainerStarted","Data":"44a384a5c1d3eb92eb03a7be158e09214d41b1db21b8fb935c4457a33e2b9cbc"} Apr 16 20:51:07.109373 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:07.109201 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:07.126143 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:07.126080 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" podStartSLOduration=6.901005337 podStartE2EDuration="7.126064465s" podCreationTimestamp="2026-04-16 20:51:00 +0000 UTC" firstStartedPulling="2026-04-16 20:51:06.105387207 +0000 UTC m=+805.090530790" lastFinishedPulling="2026-04-16 20:51:06.330446349 +0000 UTC m=+805.315589918" observedRunningTime="2026-04-16 20:51:07.125823058 +0000 UTC m=+806.110966662" watchObservedRunningTime="2026-04-16 20:51:07.126064465 +0000 UTC m=+806.111208057" Apr 16 20:51:18.124444 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:18.124411 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-2gj2p" Apr 16 20:51:20.517387 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:20.517355 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-594cf87dc6-ttvfv"] Apr 16 20:51:20.521919 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:20.521896 2537 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-594cf87dc6-ttvfv" Apr 16 20:51:20.531836 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:20.531807 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-594cf87dc6-ttvfv"] Apr 16 20:51:20.606926 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:20.606900 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcq49\" (UniqueName: \"kubernetes.io/projected/4d484da9-ffd6-405e-9d28-8ddadae96579-kube-api-access-zcq49\") pod \"authorino-594cf87dc6-ttvfv\" (UID: \"4d484da9-ffd6-405e-9d28-8ddadae96579\") " pod="kuadrant-system/authorino-594cf87dc6-ttvfv" Apr 16 20:51:20.607035 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:20.606987 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4d484da9-ffd6-405e-9d28-8ddadae96579-tls-cert\") pod \"authorino-594cf87dc6-ttvfv\" (UID: \"4d484da9-ffd6-405e-9d28-8ddadae96579\") " pod="kuadrant-system/authorino-594cf87dc6-ttvfv" Apr 16 20:51:20.707575 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:20.707538 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4d484da9-ffd6-405e-9d28-8ddadae96579-tls-cert\") pod \"authorino-594cf87dc6-ttvfv\" (UID: \"4d484da9-ffd6-405e-9d28-8ddadae96579\") " pod="kuadrant-system/authorino-594cf87dc6-ttvfv" Apr 16 20:51:20.707673 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:20.707606 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcq49\" (UniqueName: \"kubernetes.io/projected/4d484da9-ffd6-405e-9d28-8ddadae96579-kube-api-access-zcq49\") pod \"authorino-594cf87dc6-ttvfv\" (UID: \"4d484da9-ffd6-405e-9d28-8ddadae96579\") " pod="kuadrant-system/authorino-594cf87dc6-ttvfv" Apr 16 20:51:20.709865 ip-10-0-129-199 kubenswrapper[2537]: 
I0416 20:51:20.709841 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4d484da9-ffd6-405e-9d28-8ddadae96579-tls-cert\") pod \"authorino-594cf87dc6-ttvfv\" (UID: \"4d484da9-ffd6-405e-9d28-8ddadae96579\") " pod="kuadrant-system/authorino-594cf87dc6-ttvfv"
Apr 16 20:51:20.721810 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:20.721788 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcq49\" (UniqueName: \"kubernetes.io/projected/4d484da9-ffd6-405e-9d28-8ddadae96579-kube-api-access-zcq49\") pod \"authorino-594cf87dc6-ttvfv\" (UID: \"4d484da9-ffd6-405e-9d28-8ddadae96579\") " pod="kuadrant-system/authorino-594cf87dc6-ttvfv"
Apr 16 20:51:20.831621 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:20.831545 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-594cf87dc6-ttvfv"
Apr 16 20:51:20.962028 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:20.962001 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-594cf87dc6-ttvfv"]
Apr 16 20:51:20.964242 ip-10-0-129-199 kubenswrapper[2537]: W0416 20:51:20.964209 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d484da9_ffd6_405e_9d28_8ddadae96579.slice/crio-abb5c99c9d7f0465c87f360e6c179760e0a5272a21f3f54c1aaa12fe16360ae3 WatchSource:0}: Error finding container abb5c99c9d7f0465c87f360e6c179760e0a5272a21f3f54c1aaa12fe16360ae3: Status 404 returned error can't find the container with id abb5c99c9d7f0465c87f360e6c179760e0a5272a21f3f54c1aaa12fe16360ae3
Apr 16 20:51:21.157909 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:21.157875 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-594cf87dc6-ttvfv"
event={"ID":"4d484da9-ffd6-405e-9d28-8ddadae96579","Type":"ContainerStarted","Data":"abb5c99c9d7f0465c87f360e6c179760e0a5272a21f3f54c1aaa12fe16360ae3"} Apr 16 20:51:22.163336 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:22.163296 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-594cf87dc6-ttvfv" event={"ID":"4d484da9-ffd6-405e-9d28-8ddadae96579","Type":"ContainerStarted","Data":"2b2cf0d2ce8846a07ab2bf92860ca2496d74b984d9948632ad1c01504d6246ea"} Apr 16 20:51:22.197533 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:22.197474 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-594cf87dc6-ttvfv" podStartSLOduration=1.68375473 podStartE2EDuration="2.197460068s" podCreationTimestamp="2026-04-16 20:51:20 +0000 UTC" firstStartedPulling="2026-04-16 20:51:20.965685394 +0000 UTC m=+819.950828963" lastFinishedPulling="2026-04-16 20:51:21.479390732 +0000 UTC m=+820.464534301" observedRunningTime="2026-04-16 20:51:22.194469402 +0000 UTC m=+821.179612992" watchObservedRunningTime="2026-04-16 20:51:22.197460068 +0000 UTC m=+821.182603661" Apr 16 20:51:22.250615 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:22.250580 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5544d6787c-dhx56"] Apr 16 20:51:22.250850 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:22.250826 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5544d6787c-dhx56" podUID="6332ba3b-f489-4524-9e98-d5fab9a13f4a" containerName="authorino" containerID="cri-o://5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69" gracePeriod=30 Apr 16 20:51:22.494035 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:22.494010 2537 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5544d6787c-dhx56" Apr 16 20:51:22.523978 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:22.523938 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f46dm\" (UniqueName: \"kubernetes.io/projected/6332ba3b-f489-4524-9e98-d5fab9a13f4a-kube-api-access-f46dm\") pod \"6332ba3b-f489-4524-9e98-d5fab9a13f4a\" (UID: \"6332ba3b-f489-4524-9e98-d5fab9a13f4a\") " Apr 16 20:51:22.524113 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:22.524003 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6332ba3b-f489-4524-9e98-d5fab9a13f4a-tls-cert\") pod \"6332ba3b-f489-4524-9e98-d5fab9a13f4a\" (UID: \"6332ba3b-f489-4524-9e98-d5fab9a13f4a\") " Apr 16 20:51:22.526077 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:22.526053 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6332ba3b-f489-4524-9e98-d5fab9a13f4a-kube-api-access-f46dm" (OuterVolumeSpecName: "kube-api-access-f46dm") pod "6332ba3b-f489-4524-9e98-d5fab9a13f4a" (UID: "6332ba3b-f489-4524-9e98-d5fab9a13f4a"). InnerVolumeSpecName "kube-api-access-f46dm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:51:22.535152 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:22.535125 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6332ba3b-f489-4524-9e98-d5fab9a13f4a-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "6332ba3b-f489-4524-9e98-d5fab9a13f4a" (UID: "6332ba3b-f489-4524-9e98-d5fab9a13f4a"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:51:22.624616 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:22.624592 2537 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6332ba3b-f489-4524-9e98-d5fab9a13f4a-tls-cert\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:51:22.624712 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:22.624621 2537 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f46dm\" (UniqueName: \"kubernetes.io/projected/6332ba3b-f489-4524-9e98-d5fab9a13f4a-kube-api-access-f46dm\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 20:51:23.167918 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:23.167880 2537 generic.go:358] "Generic (PLEG): container finished" podID="6332ba3b-f489-4524-9e98-d5fab9a13f4a" containerID="5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69" exitCode=0 Apr 16 20:51:23.168364 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:23.167936 2537 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5544d6787c-dhx56" Apr 16 20:51:23.168364 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:23.167955 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5544d6787c-dhx56" event={"ID":"6332ba3b-f489-4524-9e98-d5fab9a13f4a","Type":"ContainerDied","Data":"5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69"} Apr 16 20:51:23.168364 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:23.167989 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5544d6787c-dhx56" event={"ID":"6332ba3b-f489-4524-9e98-d5fab9a13f4a","Type":"ContainerDied","Data":"cd26d67a1070eb20d5e1305cceeffb570c612538812316a062d0bb5c9a090a63"} Apr 16 20:51:23.168364 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:23.168005 2537 scope.go:117] "RemoveContainer" containerID="5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69" Apr 16 20:51:23.176539 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:23.176523 2537 scope.go:117] "RemoveContainer" containerID="5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69" Apr 16 20:51:23.176840 ip-10-0-129-199 kubenswrapper[2537]: E0416 20:51:23.176825 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69\": container with ID starting with 5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69 not found: ID does not exist" containerID="5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69" Apr 16 20:51:23.176918 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:23.176852 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69"} err="failed to get container status \"5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69\": rpc error: code = 
NotFound desc = could not find container \"5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69\": container with ID starting with 5342d6261cfe70d25edbdbf302831623d2286e23864e6a9c708c9faa10f47b69 not found: ID does not exist"
Apr 16 20:51:23.189462 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:23.189439 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5544d6787c-dhx56"]
Apr 16 20:51:23.194418 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:23.194397 2537 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5544d6787c-dhx56"]
Apr 16 20:51:23.589365 ip-10-0-129-199 kubenswrapper[2537]: I0416 20:51:23.589273 2537 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6332ba3b-f489-4524-9e98-d5fab9a13f4a" path="/var/lib/kubelet/pods/6332ba3b-f489-4524-9e98-d5fab9a13f4a/volumes"
Apr 16 21:02:56.626251 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:56.626221 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j"]
Apr 16 21:02:56.626701 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:56.626446 2537 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" podUID="80e4cc9c-6a18-4776-9940-378ed89ddd1d" containerName="manager" containerID="cri-o://115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea" gracePeriod=10
Apr 16 21:02:57.159839 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.159817 2537 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" Apr 16 21:02:57.190385 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.190357 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/80e4cc9c-6a18-4776-9940-378ed89ddd1d-extensions-socket-volume\") pod \"80e4cc9c-6a18-4776-9940-378ed89ddd1d\" (UID: \"80e4cc9c-6a18-4776-9940-378ed89ddd1d\") " Apr 16 21:02:57.190530 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.190424 2537 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z64xk\" (UniqueName: \"kubernetes.io/projected/80e4cc9c-6a18-4776-9940-378ed89ddd1d-kube-api-access-z64xk\") pod \"80e4cc9c-6a18-4776-9940-378ed89ddd1d\" (UID: \"80e4cc9c-6a18-4776-9940-378ed89ddd1d\") " Apr 16 21:02:57.190826 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.190799 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80e4cc9c-6a18-4776-9940-378ed89ddd1d-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "80e4cc9c-6a18-4776-9940-378ed89ddd1d" (UID: "80e4cc9c-6a18-4776-9940-378ed89ddd1d"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:02:57.192544 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.192517 2537 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80e4cc9c-6a18-4776-9940-378ed89ddd1d-kube-api-access-z64xk" (OuterVolumeSpecName: "kube-api-access-z64xk") pod "80e4cc9c-6a18-4776-9940-378ed89ddd1d" (UID: "80e4cc9c-6a18-4776-9940-378ed89ddd1d"). InnerVolumeSpecName "kube-api-access-z64xk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:02:57.291226 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.291165 2537 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z64xk\" (UniqueName: \"kubernetes.io/projected/80e4cc9c-6a18-4776-9940-378ed89ddd1d-kube-api-access-z64xk\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 21:02:57.291226 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.291190 2537 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/80e4cc9c-6a18-4776-9940-378ed89ddd1d-extensions-socket-volume\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 16 21:02:57.536473 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.536440 2537 generic.go:358] "Generic (PLEG): container finished" podID="80e4cc9c-6a18-4776-9940-378ed89ddd1d" containerID="115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea" exitCode=0 Apr 16 21:02:57.536649 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.536499 2537 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" Apr 16 21:02:57.536649 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.536517 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" event={"ID":"80e4cc9c-6a18-4776-9940-378ed89ddd1d","Type":"ContainerDied","Data":"115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea"} Apr 16 21:02:57.536649 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.536572 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j" event={"ID":"80e4cc9c-6a18-4776-9940-378ed89ddd1d","Type":"ContainerDied","Data":"1d306b77c357bbe2a92e79e880f9a470ff84dd3f2b384f7fa59b33cd79d5b3f1"} Apr 16 21:02:57.536649 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.536592 2537 scope.go:117] "RemoveContainer" containerID="115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea" Apr 16 21:02:57.545455 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.545434 2537 scope.go:117] "RemoveContainer" containerID="115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea" Apr 16 21:02:57.545783 ip-10-0-129-199 kubenswrapper[2537]: E0416 21:02:57.545761 2537 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea\": container with ID starting with 115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea not found: ID does not exist" containerID="115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea" Apr 16 21:02:57.545850 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.545793 2537 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea"} err="failed to get container status 
\"115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea\": rpc error: code = NotFound desc = could not find container \"115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea\": container with ID starting with 115ace02cbe84f0327cb795f6c781e4980c5de62a58f163a27e1ac628cbf5cea not found: ID does not exist" Apr 16 21:02:57.559829 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.559808 2537 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j"] Apr 16 21:02:57.563410 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.563393 2537 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-n6f2j"] Apr 16 21:02:57.587805 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:02:57.587779 2537 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80e4cc9c-6a18-4776-9940-378ed89ddd1d" path="/var/lib/kubelet/pods/80e4cc9c-6a18-4776-9940-378ed89ddd1d/volumes" Apr 16 21:04:02.736664 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.736620 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q"] Apr 16 21:04:02.737096 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.737013 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80e4cc9c-6a18-4776-9940-378ed89ddd1d" containerName="manager" Apr 16 21:04:02.737096 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.737026 2537 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e4cc9c-6a18-4776-9940-378ed89ddd1d" containerName="manager" Apr 16 21:04:02.737096 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.737041 2537 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6332ba3b-f489-4524-9e98-d5fab9a13f4a" containerName="authorino" Apr 16 21:04:02.737096 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.737048 2537 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6332ba3b-f489-4524-9e98-d5fab9a13f4a" containerName="authorino" Apr 16 21:04:02.737239 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.737116 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="80e4cc9c-6a18-4776-9940-378ed89ddd1d" containerName="manager" Apr 16 21:04:02.737239 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.737126 2537 memory_manager.go:356] "RemoveStaleState removing state" podUID="6332ba3b-f489-4524-9e98-d5fab9a13f4a" containerName="authorino" Apr 16 21:04:02.740352 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.740328 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q" Apr 16 21:04:02.743030 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.743009 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-kbkct\"" Apr 16 21:04:02.751774 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.751749 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q"] Apr 16 21:04:02.795068 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.795045 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qm8p\" (UniqueName: \"kubernetes.io/projected/e68f81e2-f8b3-47ec-80b2-c98615871104-kube-api-access-7qm8p\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qd45q\" (UID: \"e68f81e2-f8b3-47ec-80b2-c98615871104\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q" Apr 16 21:04:02.795183 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.795136 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/e68f81e2-f8b3-47ec-80b2-c98615871104-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qd45q\" (UID: \"e68f81e2-f8b3-47ec-80b2-c98615871104\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q" Apr 16 21:04:02.895483 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.895457 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e68f81e2-f8b3-47ec-80b2-c98615871104-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qd45q\" (UID: \"e68f81e2-f8b3-47ec-80b2-c98615871104\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q" Apr 16 21:04:02.895627 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.895499 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qm8p\" (UniqueName: \"kubernetes.io/projected/e68f81e2-f8b3-47ec-80b2-c98615871104-kube-api-access-7qm8p\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qd45q\" (UID: \"e68f81e2-f8b3-47ec-80b2-c98615871104\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q" Apr 16 21:04:02.895829 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.895810 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e68f81e2-f8b3-47ec-80b2-c98615871104-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qd45q\" (UID: \"e68f81e2-f8b3-47ec-80b2-c98615871104\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q" Apr 16 21:04:02.903747 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:02.903720 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qm8p\" (UniqueName: \"kubernetes.io/projected/e68f81e2-f8b3-47ec-80b2-c98615871104-kube-api-access-7qm8p\") 
pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qd45q\" (UID: \"e68f81e2-f8b3-47ec-80b2-c98615871104\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q" Apr 16 21:04:03.050646 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:03.050586 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q" Apr 16 21:04:03.177529 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:03.177502 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q"] Apr 16 21:04:03.179920 ip-10-0-129-199 kubenswrapper[2537]: W0416 21:04:03.179892 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode68f81e2_f8b3_47ec_80b2_c98615871104.slice/crio-4386f02919aa837f5758ab616b21ad706c3539a7a9190c86ca443cd95d4a4448 WatchSource:0}: Error finding container 4386f02919aa837f5758ab616b21ad706c3539a7a9190c86ca443cd95d4a4448: Status 404 returned error can't find the container with id 4386f02919aa837f5758ab616b21ad706c3539a7a9190c86ca443cd95d4a4448 Apr 16 21:04:03.182095 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:03.182079 2537 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 21:04:03.767286 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:03.767243 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q" event={"ID":"e68f81e2-f8b3-47ec-80b2-c98615871104","Type":"ContainerStarted","Data":"61365db4195ac0288c4114b5337bafab6031fa4f1609ca1402a197e14e6fd342"} Apr 16 21:04:03.767286 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:03.767289 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q" 
event={"ID":"e68f81e2-f8b3-47ec-80b2-c98615871104","Type":"ContainerStarted","Data":"4386f02919aa837f5758ab616b21ad706c3539a7a9190c86ca443cd95d4a4448"}
Apr 16 21:04:03.767759 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:03.767410 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q"
Apr 16 21:04:03.786887 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:03.786839 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q" podStartSLOduration=1.786825181 podStartE2EDuration="1.786825181s" podCreationTimestamp="2026-04-16 21:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:04:03.785718122 +0000 UTC m=+1582.770861734" watchObservedRunningTime="2026-04-16 21:04:03.786825181 +0000 UTC m=+1582.771968768"
Apr 16 21:04:14.773143 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:04:14.773105 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qd45q"
Apr 16 21:13:40.204966 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:40.204890 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-594cf87dc6-ttvfv_4d484da9-ffd6-405e-9d28-8ddadae96579/authorino/0.log"
Apr 16 21:13:44.487871 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:44.487840 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7cd8df7dd5-b68ld_13b78c68-8f8b-4fa4-b851-935fe80ea781/manager/0.log"
Apr 16 21:13:46.064637 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:46.064609 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-594cf87dc6-ttvfv_4d484da9-ffd6-405e-9d28-8ddadae96579/authorino/0.log"
Apr 16 21:13:46.639430 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:46.639395 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-qd45q_e68f81e2-f8b3-47ec-80b2-c98615871104/manager/0.log"
Apr 16 21:13:47.438366 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:47.438336 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-588879f674-dss9z_4c85833a-86bb-4bcf-ad25-a815d8e4ad37/kube-auth-proxy/0.log"
Apr 16 21:13:48.149502 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:48.149469 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5_41f31c78-31b2-4d46-ac75-14dc38559497/main/0.log"
Apr 16 21:13:48.156318 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:48.156297 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-rrbc5_41f31c78-31b2-4d46-ac75-14dc38559497/storage-initializer/0.log"
Apr 16 21:13:48.268699 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:48.268674 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-2gj2p_c7b4520b-3abd-44c2-8383-35c64d159211/storage-initializer/0.log"
Apr 16 21:13:48.275674 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:48.275654 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-2gj2p_c7b4520b-3abd-44c2-8383-35c64d159211/main/0.log"
Apr 16 21:13:48.617485 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:48.617413 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl_be8360e6-fbc3-4161-87ed-bbcc08568b06/storage-initializer/0.log"
Apr 16 21:13:48.624163 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:48.624142 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-sddwl_be8360e6-fbc3-4161-87ed-bbcc08568b06/main/0.log"
Apr 16 21:13:55.236073 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:55.236043 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fsph8_7d19a112-53ff-4260-97bb-aaa69848369c/global-pull-secret-syncer/0.log"
Apr 16 21:13:55.438869 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:55.438842 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nlrkh_36f6c978-ccae-4818-a370-35e0101bf84f/konnectivity-agent/0.log"
Apr 16 21:13:55.466873 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:55.466847 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-199.ec2.internal_87d4ca475b53fa90f2c794fc65d796bc/haproxy/0.log"
Apr 16 21:13:59.657343 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:59.657309 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-594cf87dc6-ttvfv_4d484da9-ffd6-405e-9d28-8ddadae96579/authorino/0.log"
Apr 16 21:13:59.899487 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:13:59.899457 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-qd45q_e68f81e2-f8b3-47ec-80b2-c98615871104/manager/0.log"
Apr 16 21:14:01.722673 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:01.722646 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rzk7j_86469fb3-aefd-4f48-8ab5-d6bbfd40f984/kube-state-metrics/0.log"
Apr 16 21:14:01.748474 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:01.748450 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rzk7j_86469fb3-aefd-4f48-8ab5-d6bbfd40f984/kube-rbac-proxy-main/0.log"
Apr 16 21:14:01.772778 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:01.772753 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rzk7j_86469fb3-aefd-4f48-8ab5-d6bbfd40f984/kube-rbac-proxy-self/0.log"
Apr 16 21:14:01.833066 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:01.833038 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-t7nch_58ddffcf-42f3-4d68-87b2-9bb8f0c60899/monitoring-plugin/0.log"
Apr 16 21:14:02.007129 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.007037 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fwdtc_6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a/node-exporter/0.log"
Apr 16 21:14:02.030023 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.029998 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fwdtc_6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a/kube-rbac-proxy/0.log"
Apr 16 21:14:02.054071 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.054051 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fwdtc_6c181ef6-1fbe-4a9b-b6ba-d5c29bc9742a/init-textfile/0.log"
Apr 16 21:14:02.187477 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.187454 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_20f067b7-64cc-4569-9397-01e5937afcfa/prometheus/0.log"
Apr 16 21:14:02.211160 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.211141 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_20f067b7-64cc-4569-9397-01e5937afcfa/config-reloader/0.log"
Apr 16 21:14:02.237096 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.237075 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_20f067b7-64cc-4569-9397-01e5937afcfa/thanos-sidecar/0.log"
Apr 16 21:14:02.255780 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.255749 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_20f067b7-64cc-4569-9397-01e5937afcfa/kube-rbac-proxy-web/0.log"
Apr 16 21:14:02.280450 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.280397 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_20f067b7-64cc-4569-9397-01e5937afcfa/kube-rbac-proxy/0.log"
Apr 16 21:14:02.302629 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.302611 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_20f067b7-64cc-4569-9397-01e5937afcfa/kube-rbac-proxy-thanos/0.log"
Apr 16 21:14:02.324294 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.324276 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_20f067b7-64cc-4569-9397-01e5937afcfa/init-config-reloader/0.log"
Apr 16 21:14:02.349976 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.349958 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-l5gv9_108c518b-e8d2-4cba-bed4-69db9efe10ac/prometheus-operator/0.log"
Apr 16 21:14:02.368399 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.368382 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-l5gv9_108c518b-e8d2-4cba-bed4-69db9efe10ac/kube-rbac-proxy/0.log"
Apr 16 21:14:02.408628 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.408609 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-bvzlr_934fbf27-bfcf-48a3-b5ad-351d9c325bab/prometheus-operator-admission-webhook/0.log"
Apr 16 21:14:02.510456 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.510434 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9c5db45d-svcm9_5af0ee5a-f6b9-4e06-8341-8a0ed59e153c/thanos-query/0.log"
Apr 16 21:14:02.534510 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.534452 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9c5db45d-svcm9_5af0ee5a-f6b9-4e06-8341-8a0ed59e153c/kube-rbac-proxy-web/0.log"
Apr 16 21:14:02.561822 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.561802 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9c5db45d-svcm9_5af0ee5a-f6b9-4e06-8341-8a0ed59e153c/kube-rbac-proxy/0.log"
Apr 16 21:14:02.583064 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.583040 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9c5db45d-svcm9_5af0ee5a-f6b9-4e06-8341-8a0ed59e153c/prom-label-proxy/0.log"
Apr 16 21:14:02.608982 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.608962 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9c5db45d-svcm9_5af0ee5a-f6b9-4e06-8341-8a0ed59e153c/kube-rbac-proxy-rules/0.log"
Apr 16 21:14:02.632085 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:02.632064 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9c5db45d-svcm9_5af0ee5a-f6b9-4e06-8341-8a0ed59e153c/kube-rbac-proxy-metrics/0.log"
Apr 16 21:14:03.648297 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:03.648270 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-s2hb7_963c2c1a-13eb-461e-bac5-d7a50b6c68ca/networking-console-plugin/0.log"
Apr 16 21:14:04.120404 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.120322 2537 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"]
Apr 16 21:14:04.124009 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.123992 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.126485 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.126462 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gr7vb\"/\"openshift-service-ca.crt\""
Apr 16 21:14:04.126618 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.126500 2537 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gr7vb\"/\"kube-root-ca.crt\""
Apr 16 21:14:04.126618 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.126577 2537 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gr7vb\"/\"default-dockercfg-55qmz\""
Apr 16 21:14:04.133601 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.133578 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"]
Apr 16 21:14:04.226729 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.226700 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/05ed768d-3ba9-499b-b217-812a535c9975-podres\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.226865 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.226794 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w62sq\" (UniqueName: \"kubernetes.io/projected/05ed768d-3ba9-499b-b217-812a535c9975-kube-api-access-w62sq\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.226865 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.226842 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/05ed768d-3ba9-499b-b217-812a535c9975-lib-modules\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.226865 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.226861 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/05ed768d-3ba9-499b-b217-812a535c9975-proc\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.226978 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.226885 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05ed768d-3ba9-499b-b217-812a535c9975-sys\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.327626 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.327596 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w62sq\" (UniqueName: \"kubernetes.io/projected/05ed768d-3ba9-499b-b217-812a535c9975-kube-api-access-w62sq\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.327751 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.327642 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/05ed768d-3ba9-499b-b217-812a535c9975-lib-modules\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.327751 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.327665 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/05ed768d-3ba9-499b-b217-812a535c9975-proc\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.327751 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.327693 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05ed768d-3ba9-499b-b217-812a535c9975-sys\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.327751 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.327733 2537 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/05ed768d-3ba9-499b-b217-812a535c9975-podres\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.327899 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.327778 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/05ed768d-3ba9-499b-b217-812a535c9975-lib-modules\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.327899 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.327805 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05ed768d-3ba9-499b-b217-812a535c9975-sys\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.327899 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.327805 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/05ed768d-3ba9-499b-b217-812a535c9975-proc\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.327899 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.327854 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/05ed768d-3ba9-499b-b217-812a535c9975-podres\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.335511 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.335494 2537 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w62sq\" (UniqueName: \"kubernetes.io/projected/05ed768d-3ba9-499b-b217-812a535c9975-kube-api-access-w62sq\") pod \"perf-node-gather-daemonset-dvhrn\" (UID: \"05ed768d-3ba9-499b-b217-812a535c9975\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.434766 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.434749 2537 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.555055 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.555014 2537 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"]
Apr 16 21:14:04.559827 ip-10-0-129-199 kubenswrapper[2537]: W0416 21:14:04.559741 2537 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod05ed768d_3ba9_499b_b217_812a535c9975.slice/crio-4190b28a18104cd6753033b68ce6e9aa795aacc3f18c98a007008711ac39971d WatchSource:0}: Error finding container 4190b28a18104cd6753033b68ce6e9aa795aacc3f18c98a007008711ac39971d: Status 404 returned error can't find the container with id 4190b28a18104cd6753033b68ce6e9aa795aacc3f18c98a007008711ac39971d
Apr 16 21:14:04.561250 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.561232 2537 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 21:14:04.801813 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.801749 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn" event={"ID":"05ed768d-3ba9-499b-b217-812a535c9975","Type":"ContainerStarted","Data":"56e99266892dc27704e324a7aae740b66a85aa1c4dc11934dd0ca572f36b9e5c"}
Apr 16 21:14:04.801813 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.801783 2537 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn" event={"ID":"05ed768d-3ba9-499b-b217-812a535c9975","Type":"ContainerStarted","Data":"4190b28a18104cd6753033b68ce6e9aa795aacc3f18c98a007008711ac39971d"}
Apr 16 21:14:04.802144 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.801886 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:04.829627 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:04.829589 2537 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn" podStartSLOduration=0.829553737 podStartE2EDuration="829.553737ms" podCreationTimestamp="2026-04-16 21:14:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:14:04.829480013 +0000 UTC m=+2183.814623604" watchObservedRunningTime="2026-04-16 21:14:04.829553737 +0000 UTC m=+2183.814697327"
Apr 16 21:14:05.248339 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:05.248310 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-8d7wj_7853bade-dca6-4e57-adbd-3fc48629e3ed/volume-data-source-validator/0.log"
Apr 16 21:14:06.090220 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:06.090192 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fw7ch_8ff520c7-8bbe-42a7-8fc6-dced59fa3098/dns/0.log"
Apr 16 21:14:06.110471 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:06.110448 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fw7ch_8ff520c7-8bbe-42a7-8fc6-dced59fa3098/kube-rbac-proxy/0.log"
Apr 16 21:14:06.210447 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:06.210427 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gsrd4_74821fb7-65d5-4396-bc93-add3e5936d13/dns-node-resolver/0.log"
Apr 16 21:14:06.728684 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:06.728655 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2t2sk_81e7762a-1005-4a21-8f55-9dda467004e0/node-ca/0.log"
Apr 16 21:14:07.717196 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:07.717167 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-588879f674-dss9z_4c85833a-86bb-4bcf-ad25-a815d8e4ad37/kube-auth-proxy/0.log"
Apr 16 21:14:08.373383 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:08.373354 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dxvbz_e45e2b17-af71-470b-a92b-013389ef5f6c/serve-healthcheck-canary/0.log"
Apr 16 21:14:08.881641 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:08.881616 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fkfvs_8a7d4825-6f48-4c8e-98c8-973ed6e7a0de/kube-rbac-proxy/0.log"
Apr 16 21:14:08.904114 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:08.904095 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fkfvs_8a7d4825-6f48-4c8e-98c8-973ed6e7a0de/exporter/0.log"
Apr 16 21:14:08.930486 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:08.930469 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fkfvs_8a7d4825-6f48-4c8e-98c8-973ed6e7a0de/extractor/0.log"
Apr 16 21:14:10.814643 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:10.814615 2537 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-dvhrn"
Apr 16 21:14:11.232355 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:11.232327 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7cd8df7dd5-b68ld_13b78c68-8f8b-4fa4-b851-935fe80ea781/manager/0.log"
Apr 16 21:14:12.476401 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:12.476370 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5494fc4578-p6zmk_fdaddecb-774b-4a02-bd24-e82de6235dbb/manager/0.log"
Apr 16 21:14:18.600701 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:18.600664 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8bj89_7770e551-6840-450b-81fa-c37714dbe265/kube-multus/0.log"
Apr 16 21:14:18.971065 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:18.971035 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kslqr_88ebfbf2-8dfe-4d3c-91ed-559a91a0a925/kube-multus-additional-cni-plugins/0.log"
Apr 16 21:14:19.009959 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:19.009935 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kslqr_88ebfbf2-8dfe-4d3c-91ed-559a91a0a925/egress-router-binary-copy/0.log"
Apr 16 21:14:19.055003 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:19.054980 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kslqr_88ebfbf2-8dfe-4d3c-91ed-559a91a0a925/cni-plugins/0.log"
Apr 16 21:14:19.090236 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:19.090200 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kslqr_88ebfbf2-8dfe-4d3c-91ed-559a91a0a925/bond-cni-plugin/0.log"
Apr 16 21:14:19.120963 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:19.120943 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kslqr_88ebfbf2-8dfe-4d3c-91ed-559a91a0a925/routeoverride-cni/0.log"
Apr 16 21:14:19.151916 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:19.151897 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kslqr_88ebfbf2-8dfe-4d3c-91ed-559a91a0a925/whereabouts-cni-bincopy/0.log"
Apr 16 21:14:19.176251 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:19.176227 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kslqr_88ebfbf2-8dfe-4d3c-91ed-559a91a0a925/whereabouts-cni/0.log"
Apr 16 21:14:19.566653 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:19.566578 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-d6xkn_11dcf076-13e5-4128-bf13-7e6c86c6dd5b/network-metrics-daemon/0.log"
Apr 16 21:14:19.593096 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:19.593069 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-d6xkn_11dcf076-13e5-4128-bf13-7e6c86c6dd5b/kube-rbac-proxy/0.log"
Apr 16 21:14:20.608371 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:20.608342 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qqqk_4f9a3fdc-c6ac-415d-a704-3724ed4158a1/ovn-controller/0.log"
Apr 16 21:14:20.663695 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:20.663671 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qqqk_4f9a3fdc-c6ac-415d-a704-3724ed4158a1/ovn-acl-logging/0.log"
Apr 16 21:14:20.704416 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:20.704395 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qqqk_4f9a3fdc-c6ac-415d-a704-3724ed4158a1/kube-rbac-proxy-node/0.log"
Apr 16 21:14:20.739603 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:20.739587 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qqqk_4f9a3fdc-c6ac-415d-a704-3724ed4158a1/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 21:14:20.772742 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:20.772722 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qqqk_4f9a3fdc-c6ac-415d-a704-3724ed4158a1/northd/0.log"
Apr 16 21:14:20.816692 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:20.816676 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qqqk_4f9a3fdc-c6ac-415d-a704-3724ed4158a1/nbdb/0.log"
Apr 16 21:14:20.842296 ip-10-0-129-199 kubenswrapper[2537]: I0416 21:14:20.842279 2537 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qqqk_4f9a3fdc-c6ac-415d-a704-3724ed4158a1/sbdb/0.log"