Apr 17 21:10:43.569213 ip-10-0-135-174 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 21:10:43.569224 ip-10-0-135-174 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 21:10:43.569231 ip-10-0-135-174 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 21:10:43.569469 ip-10-0-135-174 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 21:10:53.718942 ip-10-0-135-174 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 21:10:53.718957 ip-10-0-135-174 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 751cae78d8b84f34872d30bbb653057f --
Apr 17 21:13:16.187448 ip-10-0-135-174 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 21:13:16.613888 ip-10-0-135-174 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:13:16.613888 ip-10-0-135-174 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 21:13:16.613888 ip-10-0-135-174 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:13:16.613888 ip-10-0-135-174 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 21:13:16.613888 ip-10-0-135-174 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:13:16.615399 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.615321 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 21:13:16.617567 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617551 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 21:13:16.617567 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617566 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617570 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617574 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617577 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617580 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617583 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617586 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617589 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617591 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617594 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617596 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617599 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617604 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617608 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617610 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617613 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617616 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617620 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617623 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 21:13:16.617628 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617626 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617629 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617632 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617635 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617638 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617641 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617644 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617646 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617649 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617651 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617654 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617656 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617659 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617661 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617664 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617666 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617670 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617672 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617675 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617678 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 21:13:16.618075 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617681 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617683 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617686 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617689 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617691 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617694 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617696 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617699 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617701 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617704 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617707 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617711 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617714 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617716 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617720 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617722 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617725 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617728 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617730 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617733 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 21:13:16.618571 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617735 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617738 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617741 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617744 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617746 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617749 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617751 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617754 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617756 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617759 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617761 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617764 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617766 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617769 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617771 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617774 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617776 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617779 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617782 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617785 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 21:13:16.619049 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617787 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617790 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617792 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617795 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617797 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.617800 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618182 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618187 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618190 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618193 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618196 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618199 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618201 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618204 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618206 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618209 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618212 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618214 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618218 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618220 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 21:13:16.619550 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618223 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618225 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618228 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618230 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618234 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618237 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618239 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618242 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618245 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618247 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618250 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618253 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618256 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618258 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618261 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618264 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618266 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618269 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618272 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618274 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 21:13:16.620029 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618276 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618279 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618282 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618285 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618287 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618290 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618292 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618297 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618300 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618302 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618306 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618308 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618311 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618314 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618316 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618319 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618321 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618323 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618326 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 21:13:16.620541 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618328 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618331 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618333 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618336 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618339 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618342 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618344 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618348 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618350 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618353 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618356 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618358 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618361 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618364 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618367 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618369 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618372 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618374 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618377 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 21:13:16.621018 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618379 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618382 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618385 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618387 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618390 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618392 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618395 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618398 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618401 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618403 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618406 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618409 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618411 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.618413 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618482 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618489 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618496 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618500 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618504 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618508 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618512 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 21:13:16.621532 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618517 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618520 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618523 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618526 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618530 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618533 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618536 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618538 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618541 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618544 2572 flags.go:64] FLAG: --cloud-config=""
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618547 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618550 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618553 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618556 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618559 2572 flags.go:64] FLAG: --config-dir=""
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618562 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618566 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618569 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618573 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618576 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618579 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618582 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618585 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618588 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618591 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 21:13:16.622037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618594 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618598 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618602 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618605 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618608 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618611 2572 flags.go:64] FLAG: --enable-server="true"
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618614 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618619 2572 flags.go:64] FLAG: --event-burst="100"
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618622 2572 flags.go:64] FLAG: --event-qps="50"
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618625 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618628 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618631 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618635 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618656 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618660 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618663 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618666 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618669 2572 flags.go:64] FLAG:
--exit-on-lock-contention="false" Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618672 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618675 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618678 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618681 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618684 2572 flags.go:64] FLAG: --feature-gates="" Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618688 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618691 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 21:13:16.622649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618694 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618703 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618706 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618709 2572 flags.go:64] FLAG: --help="false" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618712 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-135-174.ec2.internal" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618715 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618718 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 21:13:16.623304 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:13:16.618721 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618724 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618727 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618730 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618733 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618736 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618739 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618742 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618745 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618748 2572 flags.go:64] FLAG: --kube-reserved="" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618751 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618754 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618757 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618759 2572 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618762 2572 flags.go:64] FLAG: --lock-file="" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618765 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618768 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 21:13:16.623304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618771 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618776 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618779 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618781 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618784 2572 flags.go:64] FLAG: --logging-format="text" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618787 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618790 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618793 2572 flags.go:64] FLAG: --manifest-url="" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618796 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618800 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618809 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 21:13:16.623889 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:13:16.618813 2572 flags.go:64] FLAG: --max-pods="110" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618816 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618819 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618821 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618824 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618827 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618830 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618833 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618842 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618845 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618852 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618855 2572 flags.go:64] FLAG: --pod-cidr="" Apr 17 21:13:16.623889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618858 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618863 2572 flags.go:64] FLAG: 
--pod-manifest-path="" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618866 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618870 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618872 2572 flags.go:64] FLAG: --port="10250" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618875 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618878 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0eadbdf4d004d9df7" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618881 2572 flags.go:64] FLAG: --qos-reserved="" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618884 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618887 2572 flags.go:64] FLAG: --register-node="true" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618890 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618893 2572 flags.go:64] FLAG: --register-with-taints="" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618897 2572 flags.go:64] FLAG: --registry-burst="10" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618900 2572 flags.go:64] FLAG: --registry-qps="5" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618902 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618905 2572 flags.go:64] FLAG: --reserved-memory="" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618909 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 
21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618912 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618915 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618918 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618926 2572 flags.go:64] FLAG: --runonce="false" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618929 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618933 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618936 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618938 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618941 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 21:13:16.624489 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618944 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618947 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618950 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618953 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618956 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: 
I0417 21:13:16.618960 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618962 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618966 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618969 2572 flags.go:64] FLAG: --system-cgroups="" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618972 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618977 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618980 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618983 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618987 2572 flags.go:64] FLAG: --tls-min-version="" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618989 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618992 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618995 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.618998 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.619001 2572 flags.go:64] FLAG: --v="2" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.619005 2572 flags.go:64] FLAG: --version="false" Apr 17 21:13:16.625117 
ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.619010 2572 flags.go:64] FLAG: --vmodule="" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.619014 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.619017 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619115 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:13:16.625117 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619119 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619122 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619126 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619130 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619134 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619137 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619140 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619143 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619145 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619148 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619151 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619153 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619156 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619160 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619163 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619181 2572 feature_gate.go:328] unrecognized feature 
gate: NewOLMOwnSingleNamespace Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619185 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619188 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619190 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 21:13:16.625717 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619193 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619196 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619199 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619202 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619205 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619207 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619210 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619212 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619215 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:13:16.626204 ip-10-0-135-174 
kubenswrapper[2572]: W0417 21:13:16.619218 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619220 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619223 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619226 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619229 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619231 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619234 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619237 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619240 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619242 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:13:16.626204 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619245 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619248 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619250 2572 feature_gate.go:328] 
unrecognized feature gate: MultiDiskSetup Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619253 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619255 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619258 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619260 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619264 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619266 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619271 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619273 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619276 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619279 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619281 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619284 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 
21:13:16.619286 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619290 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619294 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619296 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619299 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 21:13:16.626666 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619301 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619304 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619306 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619308 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619311 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619314 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619316 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619319 2572 feature_gate.go:328] unrecognized feature 
gate: BuildCSIVolumes Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619321 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619324 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619327 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619330 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619333 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619335 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619338 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619340 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619343 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619345 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619348 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619352 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:13:16.627151 ip-10-0-135-174 kubenswrapper[2572]: W0417 
21:13:16.619355 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619358 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619361 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619363 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619366 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619369 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.619371 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.620599 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.627054 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.627068 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: 
W0417 21:13:16.627119 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627123 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627127 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627129 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627132 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627135 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 21:13:16.627690 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627138 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627140 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627143 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627145 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627148 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627150 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627153 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:13:16.628088 
ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627156 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627158 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627161 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627164 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627182 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627185 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627188 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627191 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627193 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627196 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627198 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627201 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627204 2572 feature_gate.go:328] unrecognized feature gate: 
NewOLMCatalogdAPIV1Metas Apr 17 21:13:16.628088 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627207 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627209 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627212 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627214 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627217 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627224 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627227 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627229 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627232 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627234 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627237 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627239 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627242 2572 
feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627244 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627247 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627249 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627252 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627254 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627257 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627259 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:13:16.628646 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627262 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627264 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627267 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627269 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627272 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 
21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627274 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627277 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627279 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627282 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627284 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627288 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627292 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627295 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627297 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627300 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627302 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627305 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 
21:13:16.627307 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627314 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 21:13:16.629142 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627319 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627322 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627325 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627327 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627330 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627333 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627336 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627339 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627342 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627351 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627354 2572 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627357 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627360 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627362 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627365 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627368 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627371 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627374 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627377 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 21:13:16.629608 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627379 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627382 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.627387 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false 
ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627520 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627525 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627527 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627530 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627533 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627536 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627539 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627541 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627544 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627547 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627555 2572 feature_gate.go:328] unrecognized feature gate: 
NewOLMCatalogdAPIV1Metas Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627558 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627560 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:13:16.630068 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627563 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627565 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627568 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627570 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627573 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627575 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627577 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627580 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627582 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627587 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627590 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627593 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627596 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627598 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627601 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627603 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627606 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627608 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627611 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:13:16.630472 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627613 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627616 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627618 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:13:16.630932 ip-10-0-135-174 
kubenswrapper[2572]: W0417 21:13:16.627621 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627624 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627626 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627629 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627631 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627633 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627637 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627640 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627648 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627651 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627653 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627656 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627658 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627661 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627664 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627666 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 21:13:16.630932 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627669 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627671 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627674 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627676 
2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627678 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627681 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627683 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627686 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627689 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627691 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627694 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627696 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627700 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627702 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627705 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627707 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 21:13:16.631449 ip-10-0-135-174 
kubenswrapper[2572]: W0417 21:13:16.627710 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627712 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627715 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627717 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 21:13:16.631449 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627720 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627722 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627725 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627728 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627730 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627738 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627741 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627743 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627746 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 
17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627749 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627751 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627753 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627756 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627758 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:16.627761 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.627765 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 21:13:16.631933 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.628562 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 21:13:16.632505 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.630578 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 21:13:16.632505 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.631912 2572 server.go:1019] "Starting 
client certificate rotation" Apr 17 21:13:16.632505 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.632007 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 21:13:16.632725 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.632713 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 21:13:16.655384 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.655366 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 21:13:16.661488 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.661466 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 21:13:16.677323 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.677196 2572 log.go:25] "Validated CRI v1 runtime API" Apr 17 21:13:16.683256 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.683238 2572 log.go:25] "Validated CRI v1 image API" Apr 17 21:13:16.684106 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.684075 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 21:13:16.684603 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.684589 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 21:13:16.688578 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.688560 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8560af88-dcb5-48d1-a0df-ee549dfece6d:/dev/nvme0n1p4 b28a2f52-cfc8-4b0a-a765-00681ad8be1c:/dev/nvme0n1p3] Apr 17 21:13:16.688630 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.688578 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 
blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 21:13:16.694807 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.694698 2572 manager.go:217] Machine: {Timestamp:2026-04-17 21:13:16.69264585 +0000 UTC m=+0.389200913 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099973 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25d1e5e888c235d8e0b9d75042a0f6 SystemUUID:ec25d1e5-e888-c235-d8e0-b9d75042a0f6 BootID:751cae78-d8b8-4f34-872d-30bbb653057f Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:05:21:fd:51:2f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:05:21:fd:51:2f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:17:64:c7:98:31 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 21:13:16.694807 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.694802 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 21:13:16.694913 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.694878 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 21:13:16.695924 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.695899 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 21:13:16.696060 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.695925 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-174.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 21:13:16.696104 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.696069 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 21:13:16.696104 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.696076 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 21:13:16.696104 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.696089 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 21:13:16.696828 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.696818 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 21:13:16.697575 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.697565 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 21:13:16.697700 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.697691 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 21:13:16.699811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.699802 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 21:13:16.699853 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.699815 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 21:13:16.699853 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.699826 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 21:13:16.699853 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.699835 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 17 21:13:16.699853 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.699842 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 21:13:16.701048 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.701037 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 21:13:16.701107 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.701055 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 21:13:16.701440 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.701416 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rlk2l"
Apr 17 21:13:16.705192 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.705156 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 21:13:16.706718 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.706697 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rlk2l"
Apr 17 21:13:16.706840 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.706827 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 21:13:16.708647 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.708634 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 21:13:16.708713 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.708654 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 21:13:16.708713 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.708664 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 21:13:16.708713 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.708671 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 21:13:16.708713 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.708677 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 21:13:16.708713 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.708683 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 21:13:16.708713 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.708688 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 21:13:16.708713 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.708693 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 21:13:16.708713 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.708701 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 21:13:16.708713 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.708707 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 21:13:16.708713 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.708715 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 21:13:16.708975 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.708724 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 21:13:16.709515 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.709505 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 21:13:16.709552 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.709516 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 21:13:16.713095 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.713083 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 21:13:16.713151 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.713116 2572 server.go:1295] "Started kubelet"
Apr 17 21:13:16.713294 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.713204 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 21:13:16.713347 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.713290 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 21:13:16.713397 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.713355 2572 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 21:13:16.713952 ip-10-0-135-174 systemd[1]: Started Kubernetes Kubelet.
Apr 17 21:13:16.714408 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.714134 2572 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 21:13:16.714856 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.714833 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 21:13:16.715015 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.714953 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:13:16.716987 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.716972 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:13:16.718630 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.718614 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-174.ec2.internal" not found
Apr 17 21:13:16.719762 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.719743 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 21:13:16.720353 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.720270 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 21:13:16.722075 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.722057 2572 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 21:13:16.722075 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.722077 2572 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 21:13:16.722272 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.722183 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 21:13:16.722272 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.722247 2572 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 21:13:16.722272 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.722256 2572 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 21:13:16.722688 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.722672 2572 factory.go:55] Registering systemd factory
Apr 17 21:13:16.722779 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:16.722686 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-174.ec2.internal\" not found"
Apr 17 21:13:16.722779 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.722757 2572 factory.go:223] Registration of the systemd container factory successfully
Apr 17 21:13:16.723341 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.723327 2572 factory.go:153] Registering CRI-O factory
Apr 17 21:13:16.723522 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.723504 2572 factory.go:223] Registration of the crio container factory successfully
Apr 17 21:13:16.723623 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.723567 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 21:13:16.723623 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.723596 2572 factory.go:103] Registering Raw factory
Apr 17 21:13:16.723623 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.723610 2572 manager.go:1196] Started watching for new ooms in manager
Apr 17 21:13:16.724052 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.724022 2572 manager.go:319] Starting recovery of all containers
Apr 17 21:13:16.724148 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.724083 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:13:16.726264 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:16.726218 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 21:13:16.730806 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:16.730783 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-174.ec2.internal\" not found" node="ip-10-0-135-174.ec2.internal"
Apr 17 21:13:16.734512 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.734495 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-174.ec2.internal" not found
Apr 17 21:13:16.734686 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.734672 2572 manager.go:324] Recovery completed
Apr 17 21:13:16.735928 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:16.735910 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 17 21:13:16.738691 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.738678 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:13:16.740304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.740288 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-174.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:13:16.740369 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.740315 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-174.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:13:16.740369 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.740325 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-174.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:13:16.740821 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.740804 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 21:13:16.740821 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.740820 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 21:13:16.740910 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.740840 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 21:13:16.743021 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.743010 2572 policy_none.go:49] "None policy: Start"
Apr 17 21:13:16.743056 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.743026 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 21:13:16.743056 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.743035 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 21:13:16.767451 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.767437 2572 manager.go:341] "Starting Device Plugin manager"
Apr 17 21:13:16.767534 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:16.767488 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 21:13:16.767534 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.767498 2572 server.go:85] "Starting device plugin registration server"
Apr 17 21:13:16.767728 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.767716 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 21:13:16.767766 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.767731 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 21:13:16.767858 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.767845 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 21:13:16.767966 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.767916 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 21:13:16.767966 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.767928 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 21:13:16.768605 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:16.768575 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 21:13:16.768687 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:16.768610 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-174.ec2.internal\" not found"
Apr 17 21:13:16.789595 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.789581 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-174.ec2.internal" not found
Apr 17 21:13:16.861862 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.861839 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 21:13:16.862911 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.862896 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 21:13:16.862992 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.862924 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 21:13:16.862992 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.862944 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 21:13:16.862992 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.862953 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 21:13:16.863136 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:16.863030 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 21:13:16.865338 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.865293 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:13:16.868193 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.868179 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:13:16.869437 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.869422 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-174.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:13:16.869521 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.869453 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-174.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:13:16.869521 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.869468 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-174.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:13:16.869521 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.869498 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-174.ec2.internal"
Apr 17 21:13:16.876594 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.876576 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-174.ec2.internal"
Apr 17 21:13:16.876668 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:16.876602 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-174.ec2.internal\": node \"ip-10-0-135-174.ec2.internal\" not found"
Apr 17 21:13:16.963317 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.963295 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-174.ec2.internal"]
Apr 17 21:13:16.965347 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.965332 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:16.965427 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.965338 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:16.986655 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.986634 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:16.990840 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:16.990824 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:17.000210 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.000192 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 21:13:17.000301 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.000192 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 21:13:17.023877 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.023857 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c375478ce01e8633f2b111db25ba3d4f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal\" (UID: \"c375478ce01e8633f2b111db25ba3d4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:17.023923 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.023886 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c375478ce01e8633f2b111db25ba3d4f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal\" (UID: \"c375478ce01e8633f2b111db25ba3d4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:17.023923 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.023904 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a9002f67337ee452fe59c1455ae92d3b-config\") pod \"kube-apiserver-proxy-ip-10-0-135-174.ec2.internal\" (UID: \"a9002f67337ee452fe59c1455ae92d3b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:17.124348 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.124301 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c375478ce01e8633f2b111db25ba3d4f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal\" (UID: \"c375478ce01e8633f2b111db25ba3d4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:17.124348 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.124326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c375478ce01e8633f2b111db25ba3d4f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal\" (UID: \"c375478ce01e8633f2b111db25ba3d4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:17.124348 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.124343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a9002f67337ee452fe59c1455ae92d3b-config\") pod \"kube-apiserver-proxy-ip-10-0-135-174.ec2.internal\" (UID: \"a9002f67337ee452fe59c1455ae92d3b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:17.124499 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.124399 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c375478ce01e8633f2b111db25ba3d4f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal\" (UID: \"c375478ce01e8633f2b111db25ba3d4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:17.124499 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.124408 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a9002f67337ee452fe59c1455ae92d3b-config\") pod \"kube-apiserver-proxy-ip-10-0-135-174.ec2.internal\" (UID: \"a9002f67337ee452fe59c1455ae92d3b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:17.124499 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.124430 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c375478ce01e8633f2b111db25ba3d4f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal\" (UID: \"c375478ce01e8633f2b111db25ba3d4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:17.303617 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.303596 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:17.304684 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.304667 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-174.ec2.internal"
Apr 17 21:13:17.631970 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.631943 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 21:13:17.632637 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.632078 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 21:13:17.632637 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.632085 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 21:13:17.632637 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.632129 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 21:13:17.700379 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.700353 2572 apiserver.go:52] "Watching apiserver"
Apr 17 21:13:17.706081 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.706061 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 21:13:17.707946 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.707923 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-s4jhq","openshift-image-registry/node-ca-qsc7l","openshift-multus/multus-additional-cni-plugins-w6rk2","openshift-multus/multus-nbbnz","openshift-multus/network-metrics-daemon-nprtf","openshift-network-operator/iptables-alerter-4rxvx","openshift-ovn-kubernetes/ovnkube-node-7q89l","kube-system/konnectivity-agent-hllsm","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk","openshift-dns/node-resolver-qnj2m","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal","openshift-network-diagnostics/network-check-target-dzwbk","kube-system/kube-apiserver-proxy-ip-10-0-135-174.ec2.internal"]
Apr 17 21:13:17.708950 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.708916 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 21:08:16 +0000 UTC" deadline="2027-12-28 21:37:27.129363222 +0000 UTC"
Apr 17 21:13:17.708950 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.708948 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14880h24m9.420417681s"
Apr 17 21:13:17.710746 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.710332 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s4jhq"
Apr 17 21:13:17.710746 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.710637 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qsc7l"
Apr 17 21:13:17.712350 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.711643 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w6rk2"
Apr 17 21:13:17.712993 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.712551 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ntzjq\""
Apr 17 21:13:17.712993 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.712652 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 21:13:17.712993 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.712855 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 21:13:17.712993 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.712868 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 21:13:17.712993 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.712870 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 21:13:17.713574 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.713468 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 21:13:17.713975 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.713669 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5qs5g\""
Apr 17 21:13:17.714076 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.714037 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nbbnz"
Apr 17 21:13:17.714155 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.714127 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf"
Apr 17 21:13:17.714237 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.714189 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 21:13:17.714296 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:17.714248 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a"
Apr 17 21:13:17.714573 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.714532 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 21:13:17.714653 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.714583 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hv2kl\""
Apr 17 21:13:17.714708 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.714689 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 21:13:17.714835 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.714815 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 21:13:17.714907 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.714847 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 21:13:17.715736 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.715715 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4rxvx"
Apr 17 21:13:17.716109 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.716091 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 21:13:17.716226 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.716211 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8cjps\""
Apr 17 21:13:17.716886 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.716872 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l"
Apr 17 21:13:17.717405 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.717388 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 21:13:17.717547 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.717529 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 21:13:17.717784 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.717771 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 21:13:17.717784 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.717780 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-f872d\""
Apr 17 21:13:17.717887 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.717874 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/konnectivity-agent-hllsm" Apr 17 21:13:17.718714 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.718698 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 21:13:17.718829 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.718700 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 21:13:17.719125 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.719060 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.719125 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.719086 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 21:13:17.719125 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.719119 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 21:13:17.719337 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.719197 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 21:13:17.719337 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.719221 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-fdsrs\"" Apr 17 21:13:17.719747 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.719729 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 21:13:17.719877 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.719862 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 21:13:17.719949 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.719911 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 21:13:17.719949 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.719934 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 21:13:17.720049 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.719934 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zkp65\"" Apr 17 21:13:17.720274 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.720255 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qnj2m" Apr 17 21:13:17.721279 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.721262 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 21:13:17.721367 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.721295 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lk4tq\"" Apr 17 21:13:17.721448 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.721431 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:17.721506 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:17.721490 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29" Apr 17 21:13:17.721563 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.721526 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 21:13:17.721612 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.721602 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 21:13:17.722261 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.722240 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 21:13:17.722261 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.722258 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-t8jv7\"" Apr 17 21:13:17.722394 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.722240 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 21:13:17.722930 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.722916 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 21:13:17.728404 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728382 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-env-overrides\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.728404 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728404 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" 
(UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-sys-fs\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.728524 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728419 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/066bf68c-ad0c-4b29-a288-000664effe73-cni-binary-copy\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.728524 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728436 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/066bf68c-ad0c-4b29-a288-000664effe73-multus-daemon-config\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.728524 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728451 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/778ea300-0830-4ba5-8bc4-8bc4314e7652-hosts-file\") pod \"node-resolver-qnj2m\" (UID: \"778ea300-0830-4ba5-8bc4-8bc4314e7652\") " pod="openshift-dns/node-resolver-qnj2m" Apr 17 21:13:17.728524 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728489 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/778ea300-0830-4ba5-8bc4-8bc4314e7652-tmp-dir\") pod \"node-resolver-qnj2m\" (UID: \"778ea300-0830-4ba5-8bc4-8bc4314e7652\") " pod="openshift-dns/node-resolver-qnj2m" Apr 17 21:13:17.728667 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728579 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-system-cni-dir\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.728667 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728612 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-hostroot\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.728667 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728639 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4hzf\" (UniqueName: \"kubernetes.io/projected/778ea300-0830-4ba5-8bc4-8bc4314e7652-kube-api-access-j4hzf\") pod \"node-resolver-qnj2m\" (UID: \"778ea300-0830-4ba5-8bc4-8bc4314e7652\") " pod="openshift-dns/node-resolver-qnj2m" Apr 17 21:13:17.728752 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728666 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.728752 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-systemd-units\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.728752 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728702 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-run-ovn-kubernetes\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.728752 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728718 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-kubernetes\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.728752 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbjjb\" (UniqueName: \"kubernetes.io/projected/c6a3f259-7589-452e-a348-011b87bae9f4-kube-api-access-vbjjb\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.728971 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728763 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-os-release\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.728971 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728797 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-slash\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.728971 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-ovn-node-metrics-cert\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.728971 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-systemd\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.728971 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728855 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/500b1ed9-0358-4553-b67c-814ae8a286af-serviceca\") pod \"node-ca-qsc7l\" (UID: \"500b1ed9-0358-4553-b67c-814ae8a286af\") " pod="openshift-image-registry/node-ca-qsc7l" Apr 17 21:13:17.728971 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728870 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-var-lib-cni-multus\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.728971 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728903 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-multus-conf-dir\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.729246 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.728972 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-cnibin\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.729246 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729001 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-node-log\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.729246 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-cni-bin\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.729246 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729053 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ltlg\" (UniqueName: \"kubernetes.io/projected/d7ac1837-11d5-4b96-a8c1-023420cd9d90-kube-api-access-7ltlg\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.729246 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729092 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aa8fd541-9d45-4bc6-891c-792a7ef59496-agent-certs\") pod \"konnectivity-agent-hllsm\" (UID: \"aa8fd541-9d45-4bc6-891c-792a7ef59496\") " pod="kube-system/konnectivity-agent-hllsm" Apr 17 21:13:17.729246 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729117 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-registration-dir\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.729246 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-lib-modules\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.729246 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x698d\" (UniqueName: \"kubernetes.io/projected/8aa98af8-e317-4bda-af80-1a5a1ed4fbe3-kube-api-access-x698d\") pod \"iptables-alerter-4rxvx\" (UID: \"8aa98af8-e317-4bda-af80-1a5a1ed4fbe3\") " pod="openshift-network-operator/iptables-alerter-4rxvx" Apr 17 21:13:17.729246 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729213 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:17.729246 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729236 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-run-ovn\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.729663 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729261 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.729663 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729287 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-run-k8s-cni-cncf-io\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.729663 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8aa98af8-e317-4bda-af80-1a5a1ed4fbe3-host-slash\") pod \"iptables-alerter-4rxvx\" (UID: \"8aa98af8-e317-4bda-af80-1a5a1ed4fbe3\") " pod="openshift-network-operator/iptables-alerter-4rxvx" Apr 17 21:13:17.729663 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:13:17.729381 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l792\" (UniqueName: \"kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792\") pod \"network-check-target-dzwbk\" (UID: \"063917df-783e-488d-9088-c5b98092ea29\") " pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:17.729663 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.729663 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-run-netns\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.729663 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729484 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-etc-openvswitch\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.729663 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729508 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-ovnkube-config\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.729663 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729550 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-etc-selinux\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.729663 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729571 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-run\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.729663 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729600 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-var-lib-kubelet\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.729663 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nrc7\" (UniqueName: \"kubernetes.io/projected/066bf68c-ad0c-4b29-a288-000664effe73-kube-api-access-2nrc7\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.729663 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729658 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-os-release\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729683 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-var-lib-openvswitch\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729705 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-cni-netd\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729752 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs8cq\" (UniqueName: \"kubernetes.io/projected/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-kube-api-access-rs8cq\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-system-cni-dir\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " 
pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729821 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-run-netns\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729842 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-log-socket\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729872 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/500b1ed9-0358-4553-b67c-814ae8a286af-host\") pod \"node-ca-qsc7l\" (UID: \"500b1ed9-0358-4553-b67c-814ae8a286af\") " pod="openshift-image-registry/node-ca-qsc7l" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729893 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-multus-cni-dir\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-socket-dir\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: 
\"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-sysctl-d\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729969 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.729993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c6a3f259-7589-452e-a348-011b87bae9f4-etc-tuned\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730024 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-run-openvswitch\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730046 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-modprobe-d\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730068 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-sysctl-conf\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730092 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g4hb\" (UniqueName: \"kubernetes.io/projected/500b1ed9-0358-4553-b67c-814ae8a286af-kube-api-access-5g4hb\") pod \"node-ca-qsc7l\" (UID: \"500b1ed9-0358-4553-b67c-814ae8a286af\") " pod="openshift-image-registry/node-ca-qsc7l" Apr 17 21:13:17.730315 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730118 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-run-multus-certs\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730141 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8aa98af8-e317-4bda-af80-1a5a1ed4fbe3-iptables-alerter-script\") pod \"iptables-alerter-4rxvx\" (UID: \"8aa98af8-e317-4bda-af80-1a5a1ed4fbe3\") " pod="openshift-network-operator/iptables-alerter-4rxvx" Apr 17 21:13:17.730953 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:13:17.730197 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv6pz\" (UniqueName: \"kubernetes.io/projected/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-kube-api-access-gv6pz\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730230 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aa8fd541-9d45-4bc6-891c-792a7ef59496-konnectivity-ca\") pod \"konnectivity-agent-hllsm\" (UID: \"aa8fd541-9d45-4bc6-891c-792a7ef59496\") " pod="kube-system/konnectivity-agent-hllsm" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730269 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rclpc\" (UniqueName: \"kubernetes.io/projected/4574ebc4-f71f-480d-9e07-5e822f12bb1a-kube-api-access-rclpc\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730283 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-ovnkube-script-lib\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c6a3f259-7589-452e-a348-011b87bae9f4-tmp\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730330 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-cnibin\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730348 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-var-lib-cni-bin\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-etc-kubernetes\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730389 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-var-lib-kubelet\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-host\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-cni-binary-copy\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730503 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-run-systemd\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.730953 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730542 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-device-dir\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.730953 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:13:17.730560 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-sysconfig\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.731425 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-sys\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.731425 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730621 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-multus-socket-dir-parent\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.731425 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.730637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-kubelet\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.732680 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.732658 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 21:13:17.759895 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.758672 2572 csr.go:274] "Certificate 
signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dg9r6" Apr 17 21:13:17.771682 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.771515 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dg9r6" Apr 17 21:13:17.779512 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:17.779489 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9002f67337ee452fe59c1455ae92d3b.slice/crio-cafe5b2e697ae2eec8f8dae2cc97279205ddc5e0cb9cbabcba0cec2bffcede50 WatchSource:0}: Error finding container cafe5b2e697ae2eec8f8dae2cc97279205ddc5e0cb9cbabcba0cec2bffcede50: Status 404 returned error can't find the container with id cafe5b2e697ae2eec8f8dae2cc97279205ddc5e0cb9cbabcba0cec2bffcede50 Apr 17 21:13:17.782316 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:17.782291 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc375478ce01e8633f2b111db25ba3d4f.slice/crio-59f70724b94e54ea85d410cc427186c08ab83491e4fd4594152e2d26cc0a16c3 WatchSource:0}: Error finding container 59f70724b94e54ea85d410cc427186c08ab83491e4fd4594152e2d26cc0a16c3: Status 404 returned error can't find the container with id 59f70724b94e54ea85d410cc427186c08ab83491e4fd4594152e2d26cc0a16c3 Apr 17 21:13:17.784384 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.784368 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:13:17.830869 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.830848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-socket-dir\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.830970 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.830876 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-sysctl-d\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.830970 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.830892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.830970 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.830906 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c6a3f259-7589-452e-a348-011b87bae9f4-etc-tuned\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.830970 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.830951 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.831146 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.830973 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-run-openvswitch\") pod 
\"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.831146 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.830988 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-modprobe-d\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.831146 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-sysctl-conf\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.831146 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831028 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g4hb\" (UniqueName: \"kubernetes.io/projected/500b1ed9-0358-4553-b67c-814ae8a286af-kube-api-access-5g4hb\") pod \"node-ca-qsc7l\" (UID: \"500b1ed9-0358-4553-b67c-814ae8a286af\") " pod="openshift-image-registry/node-ca-qsc7l" Apr 17 21:13:17.831146 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831028 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-socket-dir\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.831146 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831052 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-run-multus-certs\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.831146 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831057 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-run-openvswitch\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.831146 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831076 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8aa98af8-e317-4bda-af80-1a5a1ed4fbe3-iptables-alerter-script\") pod \"iptables-alerter-4rxvx\" (UID: \"8aa98af8-e317-4bda-af80-1a5a1ed4fbe3\") " pod="openshift-network-operator/iptables-alerter-4rxvx" Apr 17 21:13:17.831146 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gv6pz\" (UniqueName: \"kubernetes.io/projected/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-kube-api-access-gv6pz\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.831146 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831129 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aa8fd541-9d45-4bc6-891c-792a7ef59496-konnectivity-ca\") pod \"konnectivity-agent-hllsm\" (UID: \"aa8fd541-9d45-4bc6-891c-792a7ef59496\") " pod="kube-system/konnectivity-agent-hllsm" Apr 17 21:13:17.831146 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831106 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-run-multus-certs\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831186 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-sysctl-conf\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rclpc\" (UniqueName: \"kubernetes.io/projected/4574ebc4-f71f-480d-9e07-5e822f12bb1a-kube-api-access-rclpc\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-ovnkube-script-lib\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.831652 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:13:17.831254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c6a3f259-7589-452e-a348-011b87bae9f4-tmp\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831297 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-sysctl-d\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831158 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-modprobe-d\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831341 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831344 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-cnibin\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831303 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-cnibin\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-var-lib-cni-bin\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-etc-kubernetes\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.831652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-var-lib-kubelet\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.832100 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831674 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-host\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.832100 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831701 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-cni-binary-copy\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.832100 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831713 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-etc-kubernetes\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.832100 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831725 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-run-systemd\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.832100 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831671 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-var-lib-cni-bin\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.833469 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.831734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/8aa98af8-e317-4bda-af80-1a5a1ed4fbe3-iptables-alerter-script\") pod \"iptables-alerter-4rxvx\" (UID: \"8aa98af8-e317-4bda-af80-1a5a1ed4fbe3\") " pod="openshift-network-operator/iptables-alerter-4rxvx" Apr 17 21:13:17.833568 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.833542 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aa8fd541-9d45-4bc6-891c-792a7ef59496-konnectivity-ca\") pod \"konnectivity-agent-hllsm\" (UID: \"aa8fd541-9d45-4bc6-891c-792a7ef59496\") " pod="kube-system/konnectivity-agent-hllsm" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.833668 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-run-systemd\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.833742 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-ovnkube-script-lib\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.833747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-device-dir\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.833810 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-device-dir\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.833828 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-sysconfig\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.833876 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-var-lib-kubelet\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.833886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-sys\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.833909 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-sys\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.833935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-multus-socket-dir-parent\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.833965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-kubelet\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.833970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-sysconfig\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-env-overrides\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-host\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834069 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-sys-fs\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/066bf68c-ad0c-4b29-a288-000664effe73-cni-binary-copy\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834183 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-kubelet\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.834191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/066bf68c-ad0c-4b29-a288-000664effe73-multus-daemon-config\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834229 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/778ea300-0830-4ba5-8bc4-8bc4314e7652-hosts-file\") pod \"node-resolver-qnj2m\" (UID: \"778ea300-0830-4ba5-8bc4-8bc4314e7652\") " pod="openshift-dns/node-resolver-qnj2m" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834244 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-sys-fs\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/778ea300-0830-4ba5-8bc4-8bc4314e7652-tmp-dir\") pod \"node-resolver-qnj2m\" (UID: \"778ea300-0830-4ba5-8bc4-8bc4314e7652\") " pod="openshift-dns/node-resolver-qnj2m" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-system-cni-dir\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834321 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-hostroot\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834352 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4hzf\" (UniqueName: \"kubernetes.io/projected/778ea300-0830-4ba5-8bc4-8bc4314e7652-kube-api-access-j4hzf\") pod \"node-resolver-qnj2m\" (UID: \"778ea300-0830-4ba5-8bc4-8bc4314e7652\") " pod="openshift-dns/node-resolver-qnj2m" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834384 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834439 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c6a3f259-7589-452e-a348-011b87bae9f4-etc-tuned\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834459 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-env-overrides\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834484 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-systemd-units\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-multus-socket-dir-parent\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834441 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-systemd-units\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834543 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-run-ovn-kubernetes\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834545 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-cni-binary-copy\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-kubernetes\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.835013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbjjb\" (UniqueName: \"kubernetes.io/projected/c6a3f259-7589-452e-a348-011b87bae9f4-kube-api-access-vbjjb\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.835013 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:13:17.834667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-os-release\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834676 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-run-ovn-kubernetes\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834698 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-slash\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834736 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/778ea300-0830-4ba5-8bc4-8bc4314e7652-hosts-file\") pod \"node-resolver-qnj2m\" (UID: \"778ea300-0830-4ba5-8bc4-8bc4314e7652\") " pod="openshift-dns/node-resolver-qnj2m" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-ovn-node-metrics-cert\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: 
I0417 21:13:17.834780 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-system-cni-dir\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834811 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/778ea300-0830-4ba5-8bc4-8bc4314e7652-tmp-dir\") pod \"node-resolver-qnj2m\" (UID: \"778ea300-0830-4ba5-8bc4-8bc4314e7652\") " pod="openshift-dns/node-resolver-qnj2m" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834817 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-hostroot\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-systemd\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834894 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-systemd\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834943 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-slash\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.834988 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-etc-kubernetes\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835059 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-os-release\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835094 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/500b1ed9-0358-4553-b67c-814ae8a286af-serviceca\") pod \"node-ca-qsc7l\" (UID: \"500b1ed9-0358-4553-b67c-814ae8a286af\") " pod="openshift-image-registry/node-ca-qsc7l" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835128 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-var-lib-cni-multus\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-multus-conf-dir\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835198 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-cnibin\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835227 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-node-log\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.835701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835271 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-cni-bin\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835301 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7ltlg\" (UniqueName: \"kubernetes.io/projected/d7ac1837-11d5-4b96-a8c1-023420cd9d90-kube-api-access-7ltlg\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835333 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aa8fd541-9d45-4bc6-891c-792a7ef59496-agent-certs\") pod \"konnectivity-agent-hllsm\" (UID: \"aa8fd541-9d45-4bc6-891c-792a7ef59496\") " pod="kube-system/konnectivity-agent-hllsm" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-registration-dir\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-multus-conf-dir\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835390 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-lib-modules\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 
21:13:17.835420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x698d\" (UniqueName: \"kubernetes.io/projected/8aa98af8-e317-4bda-af80-1a5a1ed4fbe3-kube-api-access-x698d\") pod \"iptables-alerter-4rxvx\" (UID: \"8aa98af8-e317-4bda-af80-1a5a1ed4fbe3\") " pod="openshift-network-operator/iptables-alerter-4rxvx" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835511 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-run-ovn\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835517 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/066bf68c-ad0c-4b29-a288-000664effe73-multus-daemon-config\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835538 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835574 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-run-k8s-cni-cncf-io\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8aa98af8-e317-4bda-af80-1a5a1ed4fbe3-host-slash\") pod \"iptables-alerter-4rxvx\" (UID: \"8aa98af8-e317-4bda-af80-1a5a1ed4fbe3\") " pod="openshift-network-operator/iptables-alerter-4rxvx" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835637 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l792\" (UniqueName: \"kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792\") pod \"network-check-target-dzwbk\" (UID: \"063917df-783e-488d-9088-c5b98092ea29\") " pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835670 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-run-netns\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.836386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835730 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-etc-openvswitch\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835759 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-ovnkube-config\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-etc-selinux\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835821 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-run\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-var-lib-kubelet\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nrc7\" (UniqueName: \"kubernetes.io/projected/066bf68c-ad0c-4b29-a288-000664effe73-kube-api-access-2nrc7\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:17.835881 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835908 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-os-release\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835938 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-var-lib-openvswitch\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:17.835969 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs podName:4574ebc4-f71f-480d-9e07-5e822f12bb1a nodeName:}" failed. 
No retries permitted until 2026-04-17 21:13:18.3359481 +0000 UTC m=+2.032503173 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs") pod "network-metrics-daemon-nprtf" (UID: "4574ebc4-f71f-480d-9e07-5e822f12bb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.835994 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-var-lib-openvswitch\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836005 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-cni-netd\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836041 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-run-ovn\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rs8cq\" (UniqueName: \"kubernetes.io/projected/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-kube-api-access-rs8cq\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-system-cni-dir\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-run-netns\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.837134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836140 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-run-k8s-cni-cncf-io\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-log-socket\") pod \"ovnkube-node-7q89l\" (UID: 
\"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/500b1ed9-0358-4553-b67c-814ae8a286af-host\") pod \"node-ca-qsc7l\" (UID: \"500b1ed9-0358-4553-b67c-814ae8a286af\") " pod="openshift-image-registry/node-ca-qsc7l" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836205 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8aa98af8-e317-4bda-af80-1a5a1ed4fbe3-host-slash\") pod \"iptables-alerter-4rxvx\" (UID: \"8aa98af8-e317-4bda-af80-1a5a1ed4fbe3\") " pod="openshift-network-operator/iptables-alerter-4rxvx" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836237 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-multus-cni-dir\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836261 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/066bf68c-ad0c-4b29-a288-000664effe73-cni-binary-copy\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836332 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-var-lib-cni-multus\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " 
pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836363 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-multus-cni-dir\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836435 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-cnibin\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-node-log\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836569 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-cni-netd\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836691 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-cni-bin\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 
21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-system-cni-dir\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836937 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-run-netns\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-host-run-netns\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.836985 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-log-socket\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.837021 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-etc-openvswitch\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 
21:13:17.837042 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/500b1ed9-0358-4553-b67c-814ae8a286af-host\") pod \"node-ca-qsc7l\" (UID: \"500b1ed9-0358-4553-b67c-814ae8a286af\") " pod="openshift-image-registry/node-ca-qsc7l" Apr 17 21:13:17.837943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.837450 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/500b1ed9-0358-4553-b67c-814ae8a286af-serviceca\") pod \"node-ca-qsc7l\" (UID: \"500b1ed9-0358-4553-b67c-814ae8a286af\") " pod="openshift-image-registry/node-ca-qsc7l" Apr 17 21:13:17.838811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.837531 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-etc-selinux\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.838811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.837600 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-run\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.838811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.837607 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-ovnkube-config\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.838811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.837687 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-os-release\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.838811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.837702 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.838811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.837753 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/066bf68c-ad0c-4b29-a288-000664effe73-host-var-lib-kubelet\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.838811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.838120 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6a3f259-7589-452e-a348-011b87bae9f4-lib-modules\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.838811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.838203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d7ac1837-11d5-4b96-a8c1-023420cd9d90-registration-dir\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.839242 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.838859 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c6a3f259-7589-452e-a348-011b87bae9f4-tmp\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.839242 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.839077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-ovn-node-metrics-cert\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.839506 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.839440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aa8fd541-9d45-4bc6-891c-792a7ef59496-agent-certs\") pod \"konnectivity-agent-hllsm\" (UID: \"aa8fd541-9d45-4bc6-891c-792a7ef59496\") " pod="kube-system/konnectivity-agent-hllsm" Apr 17 21:13:17.840959 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.840924 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rclpc\" (UniqueName: \"kubernetes.io/projected/4574ebc4-f71f-480d-9e07-5e822f12bb1a-kube-api-access-rclpc\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:17.841500 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.841438 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g4hb\" (UniqueName: \"kubernetes.io/projected/500b1ed9-0358-4553-b67c-814ae8a286af-kube-api-access-5g4hb\") pod \"node-ca-qsc7l\" (UID: \"500b1ed9-0358-4553-b67c-814ae8a286af\") " pod="openshift-image-registry/node-ca-qsc7l" Apr 17 21:13:17.841878 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.841852 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv6pz\" (UniqueName: \"kubernetes.io/projected/4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26-kube-api-access-gv6pz\") pod \"multus-additional-cni-plugins-w6rk2\" (UID: \"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26\") " pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:17.842157 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.842143 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4hzf\" (UniqueName: \"kubernetes.io/projected/778ea300-0830-4ba5-8bc4-8bc4314e7652-kube-api-access-j4hzf\") pod \"node-resolver-qnj2m\" (UID: \"778ea300-0830-4ba5-8bc4-8bc4314e7652\") " pod="openshift-dns/node-resolver-qnj2m" Apr 17 21:13:17.842737 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.842722 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbjjb\" (UniqueName: \"kubernetes.io/projected/c6a3f259-7589-452e-a348-011b87bae9f4-kube-api-access-vbjjb\") pod \"tuned-s4jhq\" (UID: \"c6a3f259-7589-452e-a348-011b87bae9f4\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:17.847072 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:17.847057 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:17.847072 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:17.847074 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:17.847232 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:17.847083 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8l792 for pod openshift-network-diagnostics/network-check-target-dzwbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:17.847232 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:17.847136 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792 podName:063917df-783e-488d-9088-c5b98092ea29 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:18.347123252 +0000 UTC m=+2.043678301 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8l792" (UniqueName: "kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792") pod "network-check-target-dzwbk" (UID: "063917df-783e-488d-9088-c5b98092ea29") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:17.849161 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.849139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ltlg\" (UniqueName: \"kubernetes.io/projected/d7ac1837-11d5-4b96-a8c1-023420cd9d90-kube-api-access-7ltlg\") pod \"aws-ebs-csi-driver-node-ffqrk\" (UID: \"d7ac1837-11d5-4b96-a8c1-023420cd9d90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:17.849383 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.849362 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs8cq\" (UniqueName: \"kubernetes.io/projected/6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82-kube-api-access-rs8cq\") pod \"ovnkube-node-7q89l\" (UID: \"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82\") " pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:17.849759 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.849735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x698d\" (UniqueName: 
\"kubernetes.io/projected/8aa98af8-e317-4bda-af80-1a5a1ed4fbe3-kube-api-access-x698d\") pod \"iptables-alerter-4rxvx\" (UID: \"8aa98af8-e317-4bda-af80-1a5a1ed4fbe3\") " pod="openshift-network-operator/iptables-alerter-4rxvx" Apr 17 21:13:17.851572 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.851557 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nrc7\" (UniqueName: \"kubernetes.io/projected/066bf68c-ad0c-4b29-a288-000664effe73-kube-api-access-2nrc7\") pod \"multus-nbbnz\" (UID: \"066bf68c-ad0c-4b29-a288-000664effe73\") " pod="openshift-multus/multus-nbbnz" Apr 17 21:13:17.866291 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.866258 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal" event={"ID":"c375478ce01e8633f2b111db25ba3d4f","Type":"ContainerStarted","Data":"59f70724b94e54ea85d410cc427186c08ab83491e4fd4594152e2d26cc0a16c3"} Apr 17 21:13:17.867195 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:17.867163 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-174.ec2.internal" event={"ID":"a9002f67337ee452fe59c1455ae92d3b","Type":"ContainerStarted","Data":"cafe5b2e697ae2eec8f8dae2cc97279205ddc5e0cb9cbabcba0cec2bffcede50"} Apr 17 21:13:18.028931 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.028909 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" Apr 17 21:13:18.034657 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:18.034633 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6a3f259_7589_452e_a348_011b87bae9f4.slice/crio-675df675215cce87a63574c6647d7cbbf921a03c2339667372263627ddc484d4 WatchSource:0}: Error finding container 675df675215cce87a63574c6647d7cbbf921a03c2339667372263627ddc484d4: Status 404 returned error can't find the container with id 675df675215cce87a63574c6647d7cbbf921a03c2339667372263627ddc484d4 Apr 17 21:13:18.043808 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.043789 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qsc7l" Apr 17 21:13:18.049466 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:18.049442 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod500b1ed9_0358_4553_b67c_814ae8a286af.slice/crio-2fe879806af43962b9331a968a1a2916740ddc0fb7f144c8ecb52ba56b3b0c93 WatchSource:0}: Error finding container 2fe879806af43962b9331a968a1a2916740ddc0fb7f144c8ecb52ba56b3b0c93: Status 404 returned error can't find the container with id 2fe879806af43962b9331a968a1a2916740ddc0fb7f144c8ecb52ba56b3b0c93 Apr 17 21:13:18.062839 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.062820 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w6rk2" Apr 17 21:13:18.067788 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:18.067770 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ec35e02_e4dc_4f4b_8be7_ab8dceabbc26.slice/crio-25df4bd58c99c41a2cbb875510b3857f4724278fd932b78290e10cacf42bc699 WatchSource:0}: Error finding container 25df4bd58c99c41a2cbb875510b3857f4724278fd932b78290e10cacf42bc699: Status 404 returned error can't find the container with id 25df4bd58c99c41a2cbb875510b3857f4724278fd932b78290e10cacf42bc699 Apr 17 21:13:18.078351 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.078336 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nbbnz" Apr 17 21:13:18.083373 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:18.083351 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod066bf68c_ad0c_4b29_a288_000664effe73.slice/crio-5819dcf22c531ce18291154efbefe8149e71a3fdb90e575ec80b29330eb310fe WatchSource:0}: Error finding container 5819dcf22c531ce18291154efbefe8149e71a3fdb90e575ec80b29330eb310fe: Status 404 returned error can't find the container with id 5819dcf22c531ce18291154efbefe8149e71a3fdb90e575ec80b29330eb310fe Apr 17 21:13:18.084951 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.084930 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4rxvx" Apr 17 21:13:18.089955 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:18.089937 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aa98af8_e317_4bda_af80_1a5a1ed4fbe3.slice/crio-abd4e84fbef1e9505cae38456c0c917eefb92ba6c8338eda38e54be8337f7076 WatchSource:0}: Error finding container abd4e84fbef1e9505cae38456c0c917eefb92ba6c8338eda38e54be8337f7076: Status 404 returned error can't find the container with id abd4e84fbef1e9505cae38456c0c917eefb92ba6c8338eda38e54be8337f7076 Apr 17 21:13:18.092585 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.092570 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:13:18.096801 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.096778 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hllsm" Apr 17 21:13:18.098152 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:18.098135 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b49b1b9_2c21_4e2b_aed1_79c8cdc16b82.slice/crio-db319d082e14dda5645d1c1094e0577c2b5f43014d27d05e3493859ddb453c13 WatchSource:0}: Error finding container db319d082e14dda5645d1c1094e0577c2b5f43014d27d05e3493859ddb453c13: Status 404 returned error can't find the container with id db319d082e14dda5645d1c1094e0577c2b5f43014d27d05e3493859ddb453c13 Apr 17 21:13:18.102352 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.102334 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" Apr 17 21:13:18.102804 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:18.102785 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa8fd541_9d45_4bc6_891c_792a7ef59496.slice/crio-2656825435d542dc1c08afb7eaab366c1b7ee9ab2f7c51b310ab592eb0a125a3 WatchSource:0}: Error finding container 2656825435d542dc1c08afb7eaab366c1b7ee9ab2f7c51b310ab592eb0a125a3: Status 404 returned error can't find the container with id 2656825435d542dc1c08afb7eaab366c1b7ee9ab2f7c51b310ab592eb0a125a3 Apr 17 21:13:18.107243 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.107227 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qnj2m" Apr 17 21:13:18.109364 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:18.109293 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7ac1837_11d5_4b96_a8c1_023420cd9d90.slice/crio-0738132b3a78cdab493248bfeb8ebbddbd3c1ed7cdf38ad2049ff09d54d662f0 WatchSource:0}: Error finding container 0738132b3a78cdab493248bfeb8ebbddbd3c1ed7cdf38ad2049ff09d54d662f0: Status 404 returned error can't find the container with id 0738132b3a78cdab493248bfeb8ebbddbd3c1ed7cdf38ad2049ff09d54d662f0 Apr 17 21:13:18.114023 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:13:18.114007 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod778ea300_0830_4ba5_8bc4_8bc4314e7652.slice/crio-6644490867d86d37e2e6310ba7c562ec81d00ee8edd3577079a998e6e6c06a66 WatchSource:0}: Error finding container 6644490867d86d37e2e6310ba7c562ec81d00ee8edd3577079a998e6e6c06a66: Status 404 returned error can't find the container with id 6644490867d86d37e2e6310ba7c562ec81d00ee8edd3577079a998e6e6c06a66 Apr 17 21:13:18.339568 
ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.339487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:18.339708 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:18.339636 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:18.339708 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:18.339698 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs podName:4574ebc4-f71f-480d-9e07-5e822f12bb1a nodeName:}" failed. No retries permitted until 2026-04-17 21:13:19.339678512 +0000 UTC m=+3.036233578 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs") pod "network-metrics-daemon-nprtf" (UID: "4574ebc4-f71f-480d-9e07-5e822f12bb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:18.441256 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.440600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l792\" (UniqueName: \"kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792\") pod \"network-check-target-dzwbk\" (UID: \"063917df-783e-488d-9088-c5b98092ea29\") " pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:18.441256 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:18.440802 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:18.441256 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:18.440822 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:18.441256 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:18.440834 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8l792 for pod openshift-network-diagnostics/network-check-target-dzwbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:18.441256 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:18.440894 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792 podName:063917df-783e-488d-9088-c5b98092ea29 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:13:19.440873509 +0000 UTC m=+3.137428582 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8l792" (UniqueName: "kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792") pod "network-check-target-dzwbk" (UID: "063917df-783e-488d-9088-c5b98092ea29") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:18.732408 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.732380 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 21:13:18.772526 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.772479 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 21:08:17 +0000 UTC" deadline="2028-01-22 17:37:08.972673132 +0000 UTC" Apr 17 21:13:18.772526 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.772524 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15476h23m50.200153149s" Apr 17 21:13:18.880615 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.880554 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qnj2m" event={"ID":"778ea300-0830-4ba5-8bc4-8bc4314e7652","Type":"ContainerStarted","Data":"6644490867d86d37e2e6310ba7c562ec81d00ee8edd3577079a998e6e6c06a66"} Apr 17 21:13:18.889710 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.889678 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" event={"ID":"d7ac1837-11d5-4b96-a8c1-023420cd9d90","Type":"ContainerStarted","Data":"0738132b3a78cdab493248bfeb8ebbddbd3c1ed7cdf38ad2049ff09d54d662f0"} Apr 17 21:13:18.891701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.891673 
2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hllsm" event={"ID":"aa8fd541-9d45-4bc6-891c-792a7ef59496","Type":"ContainerStarted","Data":"2656825435d542dc1c08afb7eaab366c1b7ee9ab2f7c51b310ab592eb0a125a3"} Apr 17 21:13:18.892995 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.892971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" event={"ID":"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82","Type":"ContainerStarted","Data":"db319d082e14dda5645d1c1094e0577c2b5f43014d27d05e3493859ddb453c13"} Apr 17 21:13:18.898963 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.898933 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nbbnz" event={"ID":"066bf68c-ad0c-4b29-a288-000664effe73","Type":"ContainerStarted","Data":"5819dcf22c531ce18291154efbefe8149e71a3fdb90e575ec80b29330eb310fe"} Apr 17 21:13:18.915644 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.915607 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w6rk2" event={"ID":"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26","Type":"ContainerStarted","Data":"25df4bd58c99c41a2cbb875510b3857f4724278fd932b78290e10cacf42bc699"} Apr 17 21:13:18.923990 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.923960 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4rxvx" event={"ID":"8aa98af8-e317-4bda-af80-1a5a1ed4fbe3","Type":"ContainerStarted","Data":"abd4e84fbef1e9505cae38456c0c917eefb92ba6c8338eda38e54be8337f7076"} Apr 17 21:13:18.929513 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:18.929487 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qsc7l" event={"ID":"500b1ed9-0358-4553-b67c-814ae8a286af","Type":"ContainerStarted","Data":"2fe879806af43962b9331a968a1a2916740ddc0fb7f144c8ecb52ba56b3b0c93"} Apr 17 21:13:18.951860 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:13:18.951628 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" event={"ID":"c6a3f259-7589-452e-a348-011b87bae9f4","Type":"ContainerStarted","Data":"675df675215cce87a63574c6647d7cbbf921a03c2339667372263627ddc484d4"} Apr 17 21:13:19.132328 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:19.132253 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 21:13:19.198518 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:19.198244 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 21:13:19.349199 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:19.349004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:19.349377 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:19.349208 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:19.349377 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:19.349269 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs podName:4574ebc4-f71f-480d-9e07-5e822f12bb1a nodeName:}" failed. No retries permitted until 2026-04-17 21:13:21.349252458 +0000 UTC m=+5.045807508 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs") pod "network-metrics-daemon-nprtf" (UID: "4574ebc4-f71f-480d-9e07-5e822f12bb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:19.449906 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:19.449827 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l792\" (UniqueName: \"kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792\") pod \"network-check-target-dzwbk\" (UID: \"063917df-783e-488d-9088-c5b98092ea29\") " pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:19.450063 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:19.449979 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:19.450063 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:19.450001 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:19.450063 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:19.450013 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8l792 for pod openshift-network-diagnostics/network-check-target-dzwbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:19.450241 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:19.450066 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792 podName:063917df-783e-488d-9088-c5b98092ea29 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:13:21.450048541 +0000 UTC m=+5.146603604 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8l792" (UniqueName: "kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792") pod "network-check-target-dzwbk" (UID: "063917df-783e-488d-9088-c5b98092ea29") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:19.773319 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:19.773270 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 21:08:17 +0000 UTC" deadline="2028-01-15 21:11:15.711486578 +0000 UTC" Apr 17 21:13:19.773858 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:19.773315 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15311h57m55.938174575s" Apr 17 21:13:19.863877 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:19.863295 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:19.863877 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:19.863295 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:19.863877 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:19.863461 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a" Apr 17 21:13:19.863877 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:19.863524 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29" Apr 17 21:13:21.365394 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:21.365356 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:21.365857 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:21.365498 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:21.365857 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:21.365567 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs podName:4574ebc4-f71f-480d-9e07-5e822f12bb1a nodeName:}" failed. No retries permitted until 2026-04-17 21:13:25.365547274 +0000 UTC m=+9.062102327 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs") pod "network-metrics-daemon-nprtf" (UID: "4574ebc4-f71f-480d-9e07-5e822f12bb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:21.466392 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:21.466353 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l792\" (UniqueName: \"kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792\") pod \"network-check-target-dzwbk\" (UID: \"063917df-783e-488d-9088-c5b98092ea29\") " pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:21.466556 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:21.466517 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:21.466556 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:21.466543 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:21.466556 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:21.466555 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8l792 for pod openshift-network-diagnostics/network-check-target-dzwbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:21.466709 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:21.466617 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792 podName:063917df-783e-488d-9088-c5b98092ea29 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:13:25.466596647 +0000 UTC m=+9.163151711 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8l792" (UniqueName: "kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792") pod "network-check-target-dzwbk" (UID: "063917df-783e-488d-9088-c5b98092ea29") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:21.864467 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:21.863742 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:21.864467 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:21.863873 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29" Apr 17 21:13:21.864467 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:21.864266 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:21.864467 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:21.864428 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a" Apr 17 21:13:23.864216 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:23.863489 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:23.864216 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:23.863617 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29" Apr 17 21:13:23.864216 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:23.864040 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:23.864216 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:23.864148 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a" Apr 17 21:13:25.396000 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:25.395947 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:25.396468 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:25.396124 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:25.396468 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:25.396211 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs podName:4574ebc4-f71f-480d-9e07-5e822f12bb1a nodeName:}" failed. No retries permitted until 2026-04-17 21:13:33.39619126 +0000 UTC m=+17.092746311 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs") pod "network-metrics-daemon-nprtf" (UID: "4574ebc4-f71f-480d-9e07-5e822f12bb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:25.496818 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:25.496652 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l792\" (UniqueName: \"kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792\") pod \"network-check-target-dzwbk\" (UID: \"063917df-783e-488d-9088-c5b98092ea29\") " pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:25.496818 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:25.496819 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:25.496818 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:25.496837 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:25.497108 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:25.496849 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8l792 for pod openshift-network-diagnostics/network-check-target-dzwbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:25.497108 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:25.496908 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792 podName:063917df-783e-488d-9088-c5b98092ea29 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:13:33.496890698 +0000 UTC m=+17.193445766 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8l792" (UniqueName: "kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792") pod "network-check-target-dzwbk" (UID: "063917df-783e-488d-9088-c5b98092ea29") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:25.863477 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:25.863435 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:25.863640 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:25.863565 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29" Apr 17 21:13:25.863950 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:25.863435 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:25.864078 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:25.864005 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a" Apr 17 21:13:27.863820 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:27.863785 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:27.864200 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:27.863786 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:27.864200 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:27.863919 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a" Apr 17 21:13:27.864200 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:27.863997 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29" Apr 17 21:13:29.863420 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:29.863377 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:29.863420 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:29.863409 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:29.863852 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:29.863519 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a" Apr 17 21:13:29.863852 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:29.863645 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29" Apr 17 21:13:31.863705 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:31.863676 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:31.864038 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:31.863688 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:31.864038 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:31.863767 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29" Apr 17 21:13:31.864038 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:31.863841 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a" Apr 17 21:13:33.455653 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:33.455609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:33.456065 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:33.455781 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:33.456065 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:33.455860 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs podName:4574ebc4-f71f-480d-9e07-5e822f12bb1a nodeName:}" failed. No retries permitted until 2026-04-17 21:13:49.455838065 +0000 UTC m=+33.152393117 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs") pod "network-metrics-daemon-nprtf" (UID: "4574ebc4-f71f-480d-9e07-5e822f12bb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:33.556154 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:33.556112 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l792\" (UniqueName: \"kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792\") pod \"network-check-target-dzwbk\" (UID: \"063917df-783e-488d-9088-c5b98092ea29\") " pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:33.556329 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:33.556301 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:33.556329 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:33.556323 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:33.556406 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:33.556335 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8l792 for pod openshift-network-diagnostics/network-check-target-dzwbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:33.556406 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:33.556397 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792 podName:063917df-783e-488d-9088-c5b98092ea29 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:13:49.556376215 +0000 UTC m=+33.252931279 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8l792" (UniqueName: "kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792") pod "network-check-target-dzwbk" (UID: "063917df-783e-488d-9088-c5b98092ea29") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:33.863604 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:33.863564 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:33.863796 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:33.863567 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:33.863796 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:33.863698 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29" Apr 17 21:13:33.863899 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:33.863796 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a" Apr 17 21:13:35.863593 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:35.863414 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:13:35.864150 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:35.863414 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:13:35.864150 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:35.863690 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29" Apr 17 21:13:35.864150 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:35.863749 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a" Apr 17 21:13:35.987043 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:35.986869 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/0.log" Apr 17 21:13:35.987419 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:35.987397 2572 generic.go:358] "Generic (PLEG): container finished" podID="6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82" containerID="f1a91a85c9dd25d0db744fd6d7d7854be3d70ceff85c6946a0c9c4867f2a2774" exitCode=1 Apr 17 21:13:35.987533 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:35.987448 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" event={"ID":"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82","Type":"ContainerStarted","Data":"275902bacd56dd9698fd6394fb8b982f19ea3743dc9bbfe8de7c683bbb55f89a"} Apr 17 21:13:35.987533 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:35.987488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" event={"ID":"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82","Type":"ContainerStarted","Data":"fe04894d76f22ae0f8a376cd10d365512fc77137bf1d6258891ab9069defebd9"} Apr 17 21:13:35.987533 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:35.987502 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" event={"ID":"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82","Type":"ContainerDied","Data":"f1a91a85c9dd25d0db744fd6d7d7854be3d70ceff85c6946a0c9c4867f2a2774"} Apr 17 21:13:35.987533 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:35.987517 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" event={"ID":"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82","Type":"ContainerStarted","Data":"2316fb2aee86b699be15dde9123bea1ce3d057cc880f6d2ddf553cd60d27c0d7"} Apr 17 21:13:35.989447 
ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:35.989419 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nbbnz" event={"ID":"066bf68c-ad0c-4b29-a288-000664effe73","Type":"ContainerStarted","Data":"7659ab52fc0fe82e2e03beef1e0acc132948adb7b95f0ff428e525410c2b9573"} Apr 17 21:13:35.991065 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:35.991045 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" event={"ID":"c6a3f259-7589-452e-a348-011b87bae9f4","Type":"ContainerStarted","Data":"8e8361ebba8804f86969d6710e4fa896e697aa52fc27818f7df1a0db71bd083d"} Apr 17 21:13:35.995557 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:35.995468 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-174.ec2.internal" event={"ID":"a9002f67337ee452fe59c1455ae92d3b","Type":"ContainerStarted","Data":"a29720433f51a5b30d6da5162061caeb6b392ec00910b8f13a105b16ffb0cedc"} Apr 17 21:13:36.004345 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:36.004287 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nbbnz" podStartSLOduration=2.493447014 podStartE2EDuration="20.004272147s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="2026-04-17 21:13:18.084688666 +0000 UTC m=+1.781243720" lastFinishedPulling="2026-04-17 21:13:35.595513791 +0000 UTC m=+19.292068853" observedRunningTime="2026-04-17 21:13:36.003960628 +0000 UTC m=+19.700515714" watchObservedRunningTime="2026-04-17 21:13:36.004272147 +0000 UTC m=+19.700827220" Apr 17 21:13:36.017277 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:36.016635 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-174.ec2.internal" podStartSLOduration=20.016616589 podStartE2EDuration="20.016616589s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:13:36.016385225 +0000 UTC m=+19.712940298" watchObservedRunningTime="2026-04-17 21:13:36.016616589 +0000 UTC m=+19.713171668" Apr 17 21:13:36.032364 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:36.032302 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-s4jhq" podStartSLOduration=2.692847109 podStartE2EDuration="20.032286368s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="2026-04-17 21:13:18.036020491 +0000 UTC m=+1.732575540" lastFinishedPulling="2026-04-17 21:13:35.375459741 +0000 UTC m=+19.072014799" observedRunningTime="2026-04-17 21:13:36.031474521 +0000 UTC m=+19.728029595" watchObservedRunningTime="2026-04-17 21:13:36.032286368 +0000 UTC m=+19.728841443" Apr 17 21:13:36.999116 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:36.998838 2572 generic.go:358] "Generic (PLEG): container finished" podID="c375478ce01e8633f2b111db25ba3d4f" containerID="d2625e117684e8ede6577a929a17598bd11dfada7bf89ea361b321e6a1dd6464" exitCode=0 Apr 17 21:13:36.999116 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:36.998936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal" event={"ID":"c375478ce01e8633f2b111db25ba3d4f","Type":"ContainerDied","Data":"d2625e117684e8ede6577a929a17598bd11dfada7bf89ea361b321e6a1dd6464"} Apr 17 21:13:37.000463 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.000438 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qnj2m" event={"ID":"778ea300-0830-4ba5-8bc4-8bc4314e7652","Type":"ContainerStarted","Data":"a47f12b54b0349cd48021b0811245d35312618086117770558552d2450a34214"} Apr 17 21:13:37.001754 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.001722 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" event={"ID":"d7ac1837-11d5-4b96-a8c1-023420cd9d90","Type":"ContainerStarted","Data":"c62b8f7340b56eb284eb0b3c29413eae248087937b258e85ef89e511f78e4b20"} Apr 17 21:13:37.003032 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.003012 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hllsm" event={"ID":"aa8fd541-9d45-4bc6-891c-792a7ef59496","Type":"ContainerStarted","Data":"4489e7cda4820e5094ee8d45c42854d0f1d65fa2ab355093c5068c311fd086d5"} Apr 17 21:13:37.005705 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.005686 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/0.log" Apr 17 21:13:37.006081 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.006053 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" event={"ID":"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82","Type":"ContainerStarted","Data":"d75d09f3573b45576b49dbed3725327fb594ee9ca0afc9261b6c22203be85873"} Apr 17 21:13:37.006081 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.006079 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" event={"ID":"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82","Type":"ContainerStarted","Data":"546305a9f08596aeea1118a99b363a357460b2bafdbc352e8e13290c236026a7"} Apr 17 21:13:37.007484 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.007464 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26" containerID="ef8e4df786f9603b24570af8e5a2ff6065c515b33295fbdde25471866a695fd6" exitCode=0 Apr 17 21:13:37.007597 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.007528 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w6rk2" 
event={"ID":"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26","Type":"ContainerDied","Data":"ef8e4df786f9603b24570af8e5a2ff6065c515b33295fbdde25471866a695fd6"}
Apr 17 21:13:37.008943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.008848 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4rxvx" event={"ID":"8aa98af8-e317-4bda-af80-1a5a1ed4fbe3","Type":"ContainerStarted","Data":"32b0ba792402b942dce56aa5ebf966978e5587adb99551f4c4aa1c1efae4152d"}
Apr 17 21:13:37.010272 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.010213 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qsc7l" event={"ID":"500b1ed9-0358-4553-b67c-814ae8a286af","Type":"ContainerStarted","Data":"2521a6191196d622fb145e77c0b655e20d41ea9cb6a74ba454cae98a58a5fc67"}
Apr 17 21:13:37.043411 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.043370 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-hllsm" podStartSLOduration=3.774404485 podStartE2EDuration="21.043357889s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="2026-04-17 21:13:18.104875177 +0000 UTC m=+1.801430226" lastFinishedPulling="2026-04-17 21:13:35.37382858 +0000 UTC m=+19.070383630" observedRunningTime="2026-04-17 21:13:37.043141708 +0000 UTC m=+20.739696780" watchObservedRunningTime="2026-04-17 21:13:37.043357889 +0000 UTC m=+20.739912962"
Apr 17 21:13:37.055982 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.055924 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qsc7l" podStartSLOduration=3.7326293 podStartE2EDuration="21.055909144s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="2026-04-17 21:13:18.050969588 +0000 UTC m=+1.747524638" lastFinishedPulling="2026-04-17 21:13:35.374249431 +0000 UTC m=+19.070804482" observedRunningTime="2026-04-17 21:13:37.055648577 +0000 UTC m=+20.752203652" watchObservedRunningTime="2026-04-17 21:13:37.055909144 +0000 UTC m=+20.752464217"
Apr 17 21:13:37.068797 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.068751 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4rxvx" podStartSLOduration=3.785871975 podStartE2EDuration="21.068736347s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="2026-04-17 21:13:18.091234502 +0000 UTC m=+1.787789555" lastFinishedPulling="2026-04-17 21:13:35.374098866 +0000 UTC m=+19.070653927" observedRunningTime="2026-04-17 21:13:37.068628502 +0000 UTC m=+20.765183575" watchObservedRunningTime="2026-04-17 21:13:37.068736347 +0000 UTC m=+20.765291420"
Apr 17 21:13:37.085102 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.085036 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qnj2m" podStartSLOduration=3.826531366 podStartE2EDuration="21.08502425s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="2026-04-17 21:13:18.115450186 +0000 UTC m=+1.812005236" lastFinishedPulling="2026-04-17 21:13:35.373943055 +0000 UTC m=+19.070498120" observedRunningTime="2026-04-17 21:13:37.084636739 +0000 UTC m=+20.781191812" watchObservedRunningTime="2026-04-17 21:13:37.08502425 +0000 UTC m=+20.781579322"
Apr 17 21:13:37.271224 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.271201 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 21:13:37.782758 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.782569 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T21:13:37.271220728Z","UUID":"7ff63f1d-0d5b-4460-9d8d-27bc2f436f6a","Handler":null,"Name":"","Endpoint":""}
Apr 17 21:13:37.785844 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.785792 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 21:13:37.785844 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.785823 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 21:13:37.863676 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.863642 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk"
Apr 17 21:13:37.863874 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:37.863774 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29"
Apr 17 21:13:37.863874 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:37.863835 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf"
Apr 17 21:13:37.863997 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:37.863958 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a"
Apr 17 21:13:38.013825 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:38.013794 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal" event={"ID":"c375478ce01e8633f2b111db25ba3d4f","Type":"ContainerStarted","Data":"25b3e9c0b26edbf60457228d7676fae2b26c5d6e3b233cd99f7b9ffdcf5e3e63"}
Apr 17 21:13:38.017741 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:38.017708 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" event={"ID":"d7ac1837-11d5-4b96-a8c1-023420cd9d90","Type":"ContainerStarted","Data":"fdd3071c0f1dfd9d3c64d4497792d921ff3171f234dfe1a014b791b43b0a8d24"}
Apr 17 21:13:38.027758 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:38.027708 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-174.ec2.internal" podStartSLOduration=22.027692052 podStartE2EDuration="22.027692052s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:13:38.027328582 +0000 UTC m=+21.723883655" watchObservedRunningTime="2026-04-17 21:13:38.027692052 +0000 UTC m=+21.724247124"
Apr 17 21:13:39.022349 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:39.022303 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" event={"ID":"d7ac1837-11d5-4b96-a8c1-023420cd9d90","Type":"ContainerStarted","Data":"5bd0e39ed7fd94b770dab536fba7e9d650763405be447d91c68b9bc6001986f3"}
Apr 17 21:13:39.025718 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:39.025672 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/0.log"
Apr 17 21:13:39.026102 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:39.026075 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" event={"ID":"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82","Type":"ContainerStarted","Data":"7409d32bc6e5881b7caab196234607285682a69d97a659dbe574063024c7f826"}
Apr 17 21:13:39.038554 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:39.038498 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffqrk" podStartSLOduration=3.150179806 podStartE2EDuration="23.038484303s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="2026-04-17 21:13:18.111388244 +0000 UTC m=+1.807943295" lastFinishedPulling="2026-04-17 21:13:37.999692742 +0000 UTC m=+21.696247792" observedRunningTime="2026-04-17 21:13:39.037424876 +0000 UTC m=+22.733979948" watchObservedRunningTime="2026-04-17 21:13:39.038484303 +0000 UTC m=+22.735039375"
Apr 17 21:13:39.863839 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:39.863804 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf"
Apr 17 21:13:39.863839 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:39.863844 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk"
Apr 17 21:13:39.864048 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:39.863926 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a"
Apr 17 21:13:39.864091 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:39.864035 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29"
Apr 17 21:13:40.880876 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:40.880669 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-hllsm"
Apr 17 21:13:40.881598 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:40.881574 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-hllsm"
Apr 17 21:13:41.030514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:41.030482 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-hllsm"
Apr 17 21:13:41.031035 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:41.031015 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-hllsm"
Apr 17 21:13:41.864101 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:41.864075 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf"
Apr 17 21:13:41.864243 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:41.864084 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk"
Apr 17 21:13:41.864243 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:41.864209 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a"
Apr 17 21:13:41.864326 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:41.864289 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29"
Apr 17 21:13:42.034593 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:42.034517 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/0.log"
Apr 17 21:13:42.034940 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:42.034816 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" event={"ID":"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82","Type":"ContainerStarted","Data":"d0e00849512bb3080c0fcb710ed2f0b046f3dbc8ce21612fb0df80c6d109b4a6"}
Apr 17 21:13:42.035291 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:42.035264 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l"
Apr 17 21:13:42.035504 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:42.035488 2572 scope.go:117] "RemoveContainer" containerID="f1a91a85c9dd25d0db744fd6d7d7854be3d70ceff85c6946a0c9c4867f2a2774"
Apr 17 21:13:42.050463 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:42.050432 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l"
Apr 17 21:13:43.047475 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:43.047280 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/0.log"
Apr 17 21:13:43.047885 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:43.047850 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" event={"ID":"6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82","Type":"ContainerStarted","Data":"0c79b16f46e0b6ac948f5fa71296558676747a940abe1ec6b3908de6c1d6d9a6"}
Apr 17 21:13:43.048235 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:43.048203 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l"
Apr 17 21:13:43.048339 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:43.048255 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l"
Apr 17 21:13:43.049990 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:43.049968 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26" containerID="03ce3006efe3ce0760a11bba629f2a8d2babe35972f3c40a591ef087a410be74" exitCode=0
Apr 17 21:13:43.050096 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:43.050054 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w6rk2" event={"ID":"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26","Type":"ContainerDied","Data":"03ce3006efe3ce0760a11bba629f2a8d2babe35972f3c40a591ef087a410be74"}
Apr 17 21:13:43.065282 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:43.065249 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l"
Apr 17 21:13:43.074487 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:43.073762 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" podStartSLOduration=9.483834971 podStartE2EDuration="27.073742456s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="2026-04-17 21:13:18.099931495 +0000 UTC m=+1.796486546" lastFinishedPulling="2026-04-17 21:13:35.689838978 +0000 UTC m=+19.386394031" observedRunningTime="2026-04-17 21:13:43.07244902 +0000 UTC m=+26.769004120" watchObservedRunningTime="2026-04-17 21:13:43.073742456 +0000 UTC m=+26.770297530"
Apr 17 21:13:43.812535 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:43.812501 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dzwbk"]
Apr 17 21:13:43.812656 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:43.812639 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk"
Apr 17 21:13:43.812776 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:43.812721 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29"
Apr 17 21:13:43.814132 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:43.814104 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nprtf"]
Apr 17 21:13:43.814265 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:43.814252 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf"
Apr 17 21:13:43.814402 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:43.814377 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a"
Apr 17 21:13:44.053711 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:44.053675 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26" containerID="c7700c32ca5dfc655094e8d7e3c3d3b91126c31d120e13220a4691d87178a038" exitCode=0
Apr 17 21:13:44.054448 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:44.053757 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w6rk2" event={"ID":"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26","Type":"ContainerDied","Data":"c7700c32ca5dfc655094e8d7e3c3d3b91126c31d120e13220a4691d87178a038"}
Apr 17 21:13:44.864413 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:44.864332 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf"
Apr 17 21:13:44.864529 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:44.864468 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a"
Apr 17 21:13:45.058048 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:45.058012 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26" containerID="9d69c9c30272214dac83ec91924b08b35f2779e03974bab624017428bc0f9626" exitCode=0
Apr 17 21:13:45.058504 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:45.058095 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w6rk2" event={"ID":"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26","Type":"ContainerDied","Data":"9d69c9c30272214dac83ec91924b08b35f2779e03974bab624017428bc0f9626"}
Apr 17 21:13:45.864198 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:45.863961 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk"
Apr 17 21:13:45.864373 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:45.864299 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29"
Apr 17 21:13:46.864536 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:46.864448 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf"
Apr 17 21:13:46.864933 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:46.864582 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a"
Apr 17 21:13:47.863888 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:47.863844 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk"
Apr 17 21:13:47.864079 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:47.863967 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dzwbk" podUID="063917df-783e-488d-9088-c5b98092ea29"
Apr 17 21:13:48.608826 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.608755 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-174.ec2.internal" event="NodeReady"
Apr 17 21:13:48.609212 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.608891 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 21:13:48.648480 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.648447 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-s9qzj"]
Apr 17 21:13:48.666052 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.666017 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dfvbk"]
Apr 17 21:13:48.666237 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.666215 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:48.668396 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.668371 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 21:13:48.668541 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.668506 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-k464c\""
Apr 17 21:13:48.668655 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.668639 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 21:13:48.690447 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.690395 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s9qzj"]
Apr 17 21:13:48.690447 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.690433 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dfvbk"]
Apr 17 21:13:48.690678 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.690569 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dfvbk"
Apr 17 21:13:48.693031 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.692988 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 21:13:48.693031 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.693008 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 21:13:48.693275 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.693043 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 21:13:48.693275 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.693068 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ggwsl\""
Apr 17 21:13:48.764997 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.764953 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:48.764997 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.765004 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5cd6\" (UniqueName: \"kubernetes.io/projected/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-kube-api-access-s5cd6\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk"
Apr 17 21:13:48.765255 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.765044 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-tmp-dir\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:48.765255 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.765093 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m44g\" (UniqueName: \"kubernetes.io/projected/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-kube-api-access-2m44g\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:48.765255 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.765196 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk"
Apr 17 21:13:48.765255 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.765244 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-config-volume\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:48.863989 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.863909 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf"
Apr 17 21:13:48.865563 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.865532 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-config-volume\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:48.865688 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.865601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:48.865688 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.865631 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5cd6\" (UniqueName: \"kubernetes.io/projected/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-kube-api-access-s5cd6\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk"
Apr 17 21:13:48.865688 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.865667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-tmp-dir\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:48.865840 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.865692 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2m44g\" (UniqueName: \"kubernetes.io/projected/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-kube-api-access-2m44g\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:48.865840 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.865721 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk"
Apr 17 21:13:48.865840 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:48.865773 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:13:48.865840 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:48.865802 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:13:48.866027 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:48.865847 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls podName:4c6d8acf-b971-44ec-b14c-d6af7a3da43c nodeName:}" failed. No retries permitted until 2026-04-17 21:13:49.365826653 +0000 UTC m=+33.062382083 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls") pod "dns-default-s9qzj" (UID: "4c6d8acf-b971-44ec-b14c-d6af7a3da43c") : secret "dns-default-metrics-tls" not found
Apr 17 21:13:48.866027 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:48.865869 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert podName:03a3935c-b9b0-4fda-8142-8cd63a6b5a88 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:49.365858142 +0000 UTC m=+33.062413197 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert") pod "ingress-canary-dfvbk" (UID: "03a3935c-b9b0-4fda-8142-8cd63a6b5a88") : secret "canary-serving-cert" not found
Apr 17 21:13:48.866235 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.866162 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-config-volume\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:48.866393 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.866350 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-tmp-dir\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:48.866589 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.866572 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 21:13:48.866664 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.866648 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zzp4n\""
Apr 17 21:13:48.876499 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.876472 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m44g\" (UniqueName: \"kubernetes.io/projected/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-kube-api-access-2m44g\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:48.876613 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:48.876549 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5cd6\" (UniqueName: \"kubernetes.io/projected/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-kube-api-access-s5cd6\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk"
Apr 17 21:13:49.369735 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:49.369687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:49.369945 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:49.369761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk"
Apr 17 21:13:49.369945 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:49.369833 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:13:49.369945 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:49.369883 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:13:49.369945 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:49.369909 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls podName:4c6d8acf-b971-44ec-b14c-d6af7a3da43c nodeName:}" failed. No retries permitted until 2026-04-17 21:13:50.369890864 +0000 UTC m=+34.066445925 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls") pod "dns-default-s9qzj" (UID: "4c6d8acf-b971-44ec-b14c-d6af7a3da43c") : secret "dns-default-metrics-tls" not found
Apr 17 21:13:49.369945 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:49.369940 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert podName:03a3935c-b9b0-4fda-8142-8cd63a6b5a88 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:50.369922859 +0000 UTC m=+34.066477916 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert") pod "ingress-canary-dfvbk" (UID: "03a3935c-b9b0-4fda-8142-8cd63a6b5a88") : secret "canary-serving-cert" not found
Apr 17 21:13:49.470648 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:49.470614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf"
Apr 17 21:13:49.470838 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:49.470816 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 21:13:49.470912 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:49.470900 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs podName:4574ebc4-f71f-480d-9e07-5e822f12bb1a nodeName:}" failed. No retries permitted until 2026-04-17 21:14:21.470878156 +0000 UTC m=+65.167433210 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs") pod "network-metrics-daemon-nprtf" (UID: "4574ebc4-f71f-480d-9e07-5e822f12bb1a") : secret "metrics-daemon-secret" not found
Apr 17 21:13:49.571931 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:49.571893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l792\" (UniqueName: \"kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792\") pod \"network-check-target-dzwbk\" (UID: \"063917df-783e-488d-9088-c5b98092ea29\") " pod="openshift-network-diagnostics/network-check-target-dzwbk"
Apr 17 21:13:49.572108 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:49.572085 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 21:13:49.572153 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:49.572115 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 21:13:49.572153 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:49.572131 2572 projected.go:194] Error preparing data for projected volume kube-api-access-8l792 for pod openshift-network-diagnostics/network-check-target-dzwbk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 21:13:49.572281 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:49.572211 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792 podName:063917df-783e-488d-9088-c5b98092ea29 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:21.57219155 +0000 UTC m=+65.268746602 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-8l792" (UniqueName: "kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792") pod "network-check-target-dzwbk" (UID: "063917df-783e-488d-9088-c5b98092ea29") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 21:13:49.863715 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:49.863678 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk"
Apr 17 21:13:49.866563 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:49.866533 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cgbn6\""
Apr 17 21:13:49.866563 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:49.866537 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 21:13:49.866746 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:49.866602 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 21:13:50.377427 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:50.377385 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:13:50.377640 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:50.377450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName:
\"kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk" Apr 17 21:13:50.377640 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:50.377540 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:13:50.377640 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:50.377562 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:13:50.377640 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:50.377624 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls podName:4c6d8acf-b971-44ec-b14c-d6af7a3da43c nodeName:}" failed. No retries permitted until 2026-04-17 21:13:52.377602672 +0000 UTC m=+36.074157738 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls") pod "dns-default-s9qzj" (UID: "4c6d8acf-b971-44ec-b14c-d6af7a3da43c") : secret "dns-default-metrics-tls" not found Apr 17 21:13:50.377820 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:50.377667 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert podName:03a3935c-b9b0-4fda-8142-8cd63a6b5a88 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:52.377637747 +0000 UTC m=+36.074192863 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert") pod "ingress-canary-dfvbk" (UID: "03a3935c-b9b0-4fda-8142-8cd63a6b5a88") : secret "canary-serving-cert" not found Apr 17 21:13:52.073578 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:52.073546 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26" containerID="92946786e920340cfd82da431f847c2cb774948be14ecd6bbeb20017356ed518" exitCode=0 Apr 17 21:13:52.073974 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:52.073596 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w6rk2" event={"ID":"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26","Type":"ContainerDied","Data":"92946786e920340cfd82da431f847c2cb774948be14ecd6bbeb20017356ed518"} Apr 17 21:13:52.394832 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:52.394744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj" Apr 17 21:13:52.394832 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:52.394789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk" Apr 17 21:13:52.395000 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:52.394882 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:13:52.395000 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:52.394887 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 17 21:13:52.395000 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:52.394932 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert podName:03a3935c-b9b0-4fda-8142-8cd63a6b5a88 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:56.394919217 +0000 UTC m=+40.091474267 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert") pod "ingress-canary-dfvbk" (UID: "03a3935c-b9b0-4fda-8142-8cd63a6b5a88") : secret "canary-serving-cert" not found Apr 17 21:13:52.395000 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:52.394946 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls podName:4c6d8acf-b971-44ec-b14c-d6af7a3da43c nodeName:}" failed. No retries permitted until 2026-04-17 21:13:56.394940018 +0000 UTC m=+40.091495067 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls") pod "dns-default-s9qzj" (UID: "4c6d8acf-b971-44ec-b14c-d6af7a3da43c") : secret "dns-default-metrics-tls" not found Apr 17 21:13:53.078451 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:53.078421 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26" containerID="7d6f18b37b07de851fff99cbce6e44a5744a0844ce254f73761bafb493b589b8" exitCode=0 Apr 17 21:13:53.078783 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:53.078463 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w6rk2" event={"ID":"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26","Type":"ContainerDied","Data":"7d6f18b37b07de851fff99cbce6e44a5744a0844ce254f73761bafb493b589b8"} Apr 17 21:13:54.083459 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:54.083295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w6rk2" event={"ID":"4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26","Type":"ContainerStarted","Data":"2d3bbd6560e3373b5c403ec5379a4be5a4afe7f45bb4e50139d63b2f9a48a612"} Apr 17 21:13:54.105065 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:54.104972 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-w6rk2" podStartSLOduration=4.974592839 podStartE2EDuration="38.104955752s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="2026-04-17 21:13:18.069131589 +0000 UTC m=+1.765686639" lastFinishedPulling="2026-04-17 21:13:51.199494489 +0000 UTC m=+34.896049552" observedRunningTime="2026-04-17 21:13:54.104149344 +0000 UTC m=+37.800704417" watchObservedRunningTime="2026-04-17 21:13:54.104955752 +0000 UTC m=+37.801510824" Apr 17 21:13:56.424371 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:56.424325 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk" Apr 17 21:13:56.424773 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:13:56.424405 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj" Apr 17 21:13:56.424773 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:56.424479 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:13:56.424773 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:56.424547 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert podName:03a3935c-b9b0-4fda-8142-8cd63a6b5a88 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:04.424532197 +0000 UTC m=+48.121087247 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert") pod "ingress-canary-dfvbk" (UID: "03a3935c-b9b0-4fda-8142-8cd63a6b5a88") : secret "canary-serving-cert" not found Apr 17 21:13:56.424773 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:56.424490 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:13:56.424773 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:13:56.424619 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls podName:4c6d8acf-b971-44ec-b14c-d6af7a3da43c nodeName:}" failed. 
No retries permitted until 2026-04-17 21:14:04.42460664 +0000 UTC m=+48.121161689 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls") pod "dns-default-s9qzj" (UID: "4c6d8acf-b971-44ec-b14c-d6af7a3da43c") : secret "dns-default-metrics-tls" not found Apr 17 21:14:04.480803 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:04.480765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk" Apr 17 21:14:04.481208 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:04.480830 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj" Apr 17 21:14:04.481208 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:04.480920 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:14:04.481208 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:04.480984 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls podName:4c6d8acf-b971-44ec-b14c-d6af7a3da43c nodeName:}" failed. No retries permitted until 2026-04-17 21:14:20.480969462 +0000 UTC m=+64.177524512 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls") pod "dns-default-s9qzj" (UID: "4c6d8acf-b971-44ec-b14c-d6af7a3da43c") : secret "dns-default-metrics-tls" not found Apr 17 21:14:04.481208 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:04.480920 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:14:04.481208 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:04.481056 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert podName:03a3935c-b9b0-4fda-8142-8cd63a6b5a88 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:20.48104598 +0000 UTC m=+64.177601049 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert") pod "ingress-canary-dfvbk" (UID: "03a3935c-b9b0-4fda-8142-8cd63a6b5a88") : secret "canary-serving-cert" not found Apr 17 21:14:05.454518 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.454483 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xjq7n"] Apr 17 21:14:05.467118 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.467090 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xjq7n" Apr 17 21:14:05.467948 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.467921 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xjq7n"] Apr 17 21:14:05.469413 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.469392 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 21:14:05.586601 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.586572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e1afcb7f-a57d-4014-a29d-5313bb6b154c-kubelet-config\") pod \"global-pull-secret-syncer-xjq7n\" (UID: \"e1afcb7f-a57d-4014-a29d-5313bb6b154c\") " pod="kube-system/global-pull-secret-syncer-xjq7n" Apr 17 21:14:05.586897 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.586640 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e1afcb7f-a57d-4014-a29d-5313bb6b154c-original-pull-secret\") pod \"global-pull-secret-syncer-xjq7n\" (UID: \"e1afcb7f-a57d-4014-a29d-5313bb6b154c\") " pod="kube-system/global-pull-secret-syncer-xjq7n" Apr 17 21:14:05.586897 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.586679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e1afcb7f-a57d-4014-a29d-5313bb6b154c-dbus\") pod \"global-pull-secret-syncer-xjq7n\" (UID: \"e1afcb7f-a57d-4014-a29d-5313bb6b154c\") " pod="kube-system/global-pull-secret-syncer-xjq7n" Apr 17 21:14:05.687544 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.687517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/e1afcb7f-a57d-4014-a29d-5313bb6b154c-dbus\") pod \"global-pull-secret-syncer-xjq7n\" (UID: \"e1afcb7f-a57d-4014-a29d-5313bb6b154c\") " pod="kube-system/global-pull-secret-syncer-xjq7n" Apr 17 21:14:05.687544 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.687547 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e1afcb7f-a57d-4014-a29d-5313bb6b154c-kubelet-config\") pod \"global-pull-secret-syncer-xjq7n\" (UID: \"e1afcb7f-a57d-4014-a29d-5313bb6b154c\") " pod="kube-system/global-pull-secret-syncer-xjq7n" Apr 17 21:14:05.687724 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.687703 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e1afcb7f-a57d-4014-a29d-5313bb6b154c-original-pull-secret\") pod \"global-pull-secret-syncer-xjq7n\" (UID: \"e1afcb7f-a57d-4014-a29d-5313bb6b154c\") " pod="kube-system/global-pull-secret-syncer-xjq7n" Apr 17 21:14:05.687771 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.687761 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e1afcb7f-a57d-4014-a29d-5313bb6b154c-kubelet-config\") pod \"global-pull-secret-syncer-xjq7n\" (UID: \"e1afcb7f-a57d-4014-a29d-5313bb6b154c\") " pod="kube-system/global-pull-secret-syncer-xjq7n" Apr 17 21:14:05.687805 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.687764 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e1afcb7f-a57d-4014-a29d-5313bb6b154c-dbus\") pod \"global-pull-secret-syncer-xjq7n\" (UID: \"e1afcb7f-a57d-4014-a29d-5313bb6b154c\") " pod="kube-system/global-pull-secret-syncer-xjq7n" Apr 17 21:14:05.690982 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.690955 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e1afcb7f-a57d-4014-a29d-5313bb6b154c-original-pull-secret\") pod \"global-pull-secret-syncer-xjq7n\" (UID: \"e1afcb7f-a57d-4014-a29d-5313bb6b154c\") " pod="kube-system/global-pull-secret-syncer-xjq7n" Apr 17 21:14:05.776314 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.776292 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xjq7n" Apr 17 21:14:05.956658 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:05.956620 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xjq7n"] Apr 17 21:14:06.105975 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:06.105907 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xjq7n" event={"ID":"e1afcb7f-a57d-4014-a29d-5313bb6b154c","Type":"ContainerStarted","Data":"906add2e066e95ce53a01516f431d2edcda3b7ced8f5c3872ba5b504513a81e7"} Apr 17 21:14:10.115642 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:10.115563 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xjq7n" event={"ID":"e1afcb7f-a57d-4014-a29d-5313bb6b154c","Type":"ContainerStarted","Data":"a40bd2ab1ee70b7bced6eb01c9a6b22e457792c983cb0be84f44267b50d22383"} Apr 17 21:14:10.128644 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:10.128486 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xjq7n" podStartSLOduration=1.293048279 podStartE2EDuration="5.128468046s" podCreationTimestamp="2026-04-17 21:14:05 +0000 UTC" firstStartedPulling="2026-04-17 21:14:05.961553497 +0000 UTC m=+49.658108551" lastFinishedPulling="2026-04-17 21:14:09.796973267 +0000 UTC m=+53.493528318" observedRunningTime="2026-04-17 21:14:10.128135877 +0000 UTC m=+53.824690951" watchObservedRunningTime="2026-04-17 21:14:10.128468046 +0000 UTC m=+53.825023118" Apr 17 
21:14:15.069879 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:15.069851 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7q89l" Apr 17 21:14:20.491335 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:20.491277 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk" Apr 17 21:14:20.491732 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:20.491376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj" Apr 17 21:14:20.491732 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:20.491451 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:14:20.491732 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:20.491464 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:14:20.491732 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:20.491527 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert podName:03a3935c-b9b0-4fda-8142-8cd63a6b5a88 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:52.491505703 +0000 UTC m=+96.188060768 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert") pod "ingress-canary-dfvbk" (UID: "03a3935c-b9b0-4fda-8142-8cd63a6b5a88") : secret "canary-serving-cert" not found Apr 17 21:14:20.491732 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:20.491545 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls podName:4c6d8acf-b971-44ec-b14c-d6af7a3da43c nodeName:}" failed. No retries permitted until 2026-04-17 21:14:52.491536255 +0000 UTC m=+96.188091306 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls") pod "dns-default-s9qzj" (UID: "4c6d8acf-b971-44ec-b14c-d6af7a3da43c") : secret "dns-default-metrics-tls" not found Apr 17 21:14:21.498274 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:21.498238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:14:21.498657 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:21.498380 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 21:14:21.498657 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:21.498449 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs podName:4574ebc4-f71f-480d-9e07-5e822f12bb1a nodeName:}" failed. No retries permitted until 2026-04-17 21:15:25.498434413 +0000 UTC m=+129.194989466 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs") pod "network-metrics-daemon-nprtf" (UID: "4574ebc4-f71f-480d-9e07-5e822f12bb1a") : secret "metrics-daemon-secret" not found Apr 17 21:14:21.599506 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:21.599469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l792\" (UniqueName: \"kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792\") pod \"network-check-target-dzwbk\" (UID: \"063917df-783e-488d-9088-c5b98092ea29\") " pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:14:21.602133 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:21.602109 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 21:14:21.611861 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:21.611843 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 21:14:21.622866 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:21.622847 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l792\" (UniqueName: \"kubernetes.io/projected/063917df-783e-488d-9088-c5b98092ea29-kube-api-access-8l792\") pod \"network-check-target-dzwbk\" (UID: \"063917df-783e-488d-9088-c5b98092ea29\") " pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:14:21.676204 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:21.676162 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cgbn6\"" Apr 17 21:14:21.684024 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:21.684006 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:14:21.788684 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:21.788643 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dzwbk"] Apr 17 21:14:21.792966 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:14:21.792938 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod063917df_783e_488d_9088_c5b98092ea29.slice/crio-01364ad9c2716c56aacd93688f4f603719b90dcad51f0f014914e515c3689783 WatchSource:0}: Error finding container 01364ad9c2716c56aacd93688f4f603719b90dcad51f0f014914e515c3689783: Status 404 returned error can't find the container with id 01364ad9c2716c56aacd93688f4f603719b90dcad51f0f014914e515c3689783 Apr 17 21:14:22.137144 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:22.137068 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dzwbk" event={"ID":"063917df-783e-488d-9088-c5b98092ea29","Type":"ContainerStarted","Data":"01364ad9c2716c56aacd93688f4f603719b90dcad51f0f014914e515c3689783"} Apr 17 21:14:25.143434 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:25.143374 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dzwbk" event={"ID":"063917df-783e-488d-9088-c5b98092ea29","Type":"ContainerStarted","Data":"a88c481d79c4e9be1853a739613c9309ef26469a037da5c41baa526c0e4b202b"} Apr 17 21:14:25.143864 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:25.143477 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dzwbk" Apr 17 21:14:25.159918 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:25.159873 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dzwbk" 
podStartSLOduration=66.357575864 podStartE2EDuration="1m9.159863031s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="2026-04-17 21:14:21.794913181 +0000 UTC m=+65.491468234" lastFinishedPulling="2026-04-17 21:14:24.597200347 +0000 UTC m=+68.293755401" observedRunningTime="2026-04-17 21:14:25.158933943 +0000 UTC m=+68.855489014" watchObservedRunningTime="2026-04-17 21:14:25.159863031 +0000 UTC m=+68.856418102"
Apr 17 21:14:52.587934 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:52.587889 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk"
Apr 17 21:14:52.588374 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:52.587957 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:14:52.588374 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:52.588023 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:14:52.588374 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:52.588058 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:14:52.588374 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:52.588085 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert podName:03a3935c-b9b0-4fda-8142-8cd63a6b5a88 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:56.588068447 +0000 UTC m=+160.284623497 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert") pod "ingress-canary-dfvbk" (UID: "03a3935c-b9b0-4fda-8142-8cd63a6b5a88") : secret "canary-serving-cert" not found
Apr 17 21:14:52.588374 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:14:52.588104 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls podName:4c6d8acf-b971-44ec-b14c-d6af7a3da43c nodeName:}" failed. No retries permitted until 2026-04-17 21:15:56.588092535 +0000 UTC m=+160.284647586 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls") pod "dns-default-s9qzj" (UID: "4c6d8acf-b971-44ec-b14c-d6af7a3da43c") : secret "dns-default-metrics-tls" not found
Apr 17 21:14:56.148221 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:14:56.148191 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dzwbk"
Apr 17 21:15:25.517804 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:25.517758 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf"
Apr 17 21:15:25.518336 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:25.517906 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 21:15:25.518336 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:25.517966 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs podName:4574ebc4-f71f-480d-9e07-5e822f12bb1a nodeName:}" failed. No retries permitted until 2026-04-17 21:17:27.51795155 +0000 UTC m=+251.214506600 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs") pod "network-metrics-daemon-nprtf" (UID: "4574ebc4-f71f-480d-9e07-5e822f12bb1a") : secret "metrics-daemon-secret" not found
Apr 17 21:15:30.661944 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.661907 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d2h87"]
Apr 17 21:15:30.664711 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.664688 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.665339 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.665321 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7c95fdb69b-rgqgn"]
Apr 17 21:15:30.666993 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.666970 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 21:15:30.667105 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.667043 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 21:15:30.667105 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.667043 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-2kccd\""
Apr 17 21:15:30.667307 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.667165 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 21:15:30.667352 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.667325 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 21:15:30.667890 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.667875 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.669739 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.669721 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 21:15:30.669828 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.669816 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 21:15:30.669885 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.669835 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 21:15:30.669885 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.669851 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vhzx2\""
Apr 17 21:15:30.670068 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.670052 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 21:15:30.670135 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.670117 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 21:15:30.670192 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.670142 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 21:15:30.671794 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.671763 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 21:15:30.672402 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.672376 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d2h87"]
Apr 17 21:15:30.683464 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.683443 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7c95fdb69b-rgqgn"]
Apr 17 21:15:30.755270 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.755232 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-service-ca-bundle\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.755436 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.755284 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr9nn\" (UniqueName: \"kubernetes.io/projected/a7386c80-9465-42c2-b807-c545f0e73d34-kube-api-access-sr9nn\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.755436 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.755343 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-stats-auth\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.755436 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.755370 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-tmp\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.755436 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.755385 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-snapshots\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.755436 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.755401 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-serving-cert\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.755436 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.755416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnq5s\" (UniqueName: \"kubernetes.io/projected/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-kube-api-access-gnq5s\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.755622 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.755483 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.755622 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.755540 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.755622 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.755587 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.755622 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.755610 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-default-certificate\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.856570 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.856510 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.856776 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.856584 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.856776 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.856613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-default-certificate\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.856776 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.856636 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-service-ca-bundle\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.856776 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:30.856656 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 21:15:30.856776 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.856693 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sr9nn\" (UniqueName: \"kubernetes.io/projected/a7386c80-9465-42c2-b807-c545f0e73d34-kube-api-access-sr9nn\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.856776 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:30.856721 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs podName:a7386c80-9465-42c2-b807-c545f0e73d34 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:31.356705329 +0000 UTC m=+135.053260378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs") pod "router-default-7c95fdb69b-rgqgn" (UID: "a7386c80-9465-42c2-b807-c545f0e73d34") : secret "router-metrics-certs-default" not found
Apr 17 21:15:30.857114 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.856777 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-stats-auth\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.857114 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.856802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-tmp\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.857114 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.856824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-snapshots\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.857114 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.856846 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-serving-cert\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.857114 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.856870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnq5s\" (UniqueName: \"kubernetes.io/projected/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-kube-api-access-gnq5s\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.857114 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.856898 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.857114 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:30.857002 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle podName:a7386c80-9465-42c2-b807-c545f0e73d34 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:31.356987975 +0000 UTC m=+135.053543041 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle") pod "router-default-7c95fdb69b-rgqgn" (UID: "a7386c80-9465-42c2-b807-c545f0e73d34") : configmap references non-existent config key: service-ca.crt
Apr 17 21:15:30.857461 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.857300 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-service-ca-bundle\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.857758 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.857737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-tmp\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.857819 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.857775 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-snapshots\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.857963 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.857944 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.859591 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.859568 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-serving-cert\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.859696 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.859645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-default-certificate\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.859696 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.859670 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-stats-auth\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.864967 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.864944 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnq5s\" (UniqueName: \"kubernetes.io/projected/6bd70117-3252-4f39-aaf3-d2d5efefcc3b-kube-api-access-gnq5s\") pod \"insights-operator-585dfdc468-d2h87\" (UID: \"6bd70117-3252-4f39-aaf3-d2d5efefcc3b\") " pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:30.865346 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.865330 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr9nn\" (UniqueName: \"kubernetes.io/projected/a7386c80-9465-42c2-b807-c545f0e73d34-kube-api-access-sr9nn\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:30.975399 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:30.975367 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-d2h87"
Apr 17 21:15:31.089456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:31.089417 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d2h87"]
Apr 17 21:15:31.093476 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:15:31.093445 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bd70117_3252_4f39_aaf3_d2d5efefcc3b.slice/crio-b2aaf3a08b2c23bfbecd9ca34b864c5592797cc2d209c60fabc4b682090721a9 WatchSource:0}: Error finding container b2aaf3a08b2c23bfbecd9ca34b864c5592797cc2d209c60fabc4b682090721a9: Status 404 returned error can't find the container with id b2aaf3a08b2c23bfbecd9ca34b864c5592797cc2d209c60fabc4b682090721a9
Apr 17 21:15:31.272216 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:31.272118 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d2h87" event={"ID":"6bd70117-3252-4f39-aaf3-d2d5efefcc3b","Type":"ContainerStarted","Data":"b2aaf3a08b2c23bfbecd9ca34b864c5592797cc2d209c60fabc4b682090721a9"}
Apr 17 21:15:31.359414 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:31.359368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:31.359414 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:31.359417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:31.359634 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:31.359512 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 21:15:31.359634 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:31.359535 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle podName:a7386c80-9465-42c2-b807-c545f0e73d34 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:32.359517687 +0000 UTC m=+136.056072736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle") pod "router-default-7c95fdb69b-rgqgn" (UID: "a7386c80-9465-42c2-b807-c545f0e73d34") : configmap references non-existent config key: service-ca.crt
Apr 17 21:15:31.359634 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:31.359558 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs podName:a7386c80-9465-42c2-b807-c545f0e73d34 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:32.359551212 +0000 UTC m=+136.056106261 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs") pod "router-default-7c95fdb69b-rgqgn" (UID: "a7386c80-9465-42c2-b807-c545f0e73d34") : secret "router-metrics-certs-default" not found
Apr 17 21:15:32.366102 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:32.366057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:32.366556 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:32.366126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:32.366556 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:32.366264 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 21:15:32.366556 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:32.366294 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle podName:a7386c80-9465-42c2-b807-c545f0e73d34 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:34.366266444 +0000 UTC m=+138.062821499 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle") pod "router-default-7c95fdb69b-rgqgn" (UID: "a7386c80-9465-42c2-b807-c545f0e73d34") : configmap references non-existent config key: service-ca.crt
Apr 17 21:15:32.366556 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:32.366326 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs podName:a7386c80-9465-42c2-b807-c545f0e73d34 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:34.366314408 +0000 UTC m=+138.062869468 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs") pod "router-default-7c95fdb69b-rgqgn" (UID: "a7386c80-9465-42c2-b807-c545f0e73d34") : secret "router-metrics-certs-default" not found
Apr 17 21:15:33.277299 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:33.277265 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d2h87" event={"ID":"6bd70117-3252-4f39-aaf3-d2d5efefcc3b","Type":"ContainerStarted","Data":"45d2a3728fab1e0df9c97def2480c8f4954325fba8f0d5c502a5be0f96114e4b"}
Apr 17 21:15:33.292382 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:33.292325 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-d2h87" podStartSLOduration=1.54468119 podStartE2EDuration="3.292305354s" podCreationTimestamp="2026-04-17 21:15:30 +0000 UTC" firstStartedPulling="2026-04-17 21:15:31.095246496 +0000 UTC m=+134.791801550" lastFinishedPulling="2026-04-17 21:15:32.84287065 +0000 UTC m=+136.539425714" observedRunningTime="2026-04-17 21:15:33.291531486 +0000 UTC m=+136.988086552" watchObservedRunningTime="2026-04-17 21:15:33.292305354 +0000 UTC m=+136.988860428"
Apr 17 21:15:34.382410 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:34.382363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:34.382410 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:34.382415 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:34.382840 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:34.382496 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 21:15:34.382840 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:34.382531 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle podName:a7386c80-9465-42c2-b807-c545f0e73d34 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:38.382502933 +0000 UTC m=+142.079057983 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle") pod "router-default-7c95fdb69b-rgqgn" (UID: "a7386c80-9465-42c2-b807-c545f0e73d34") : configmap references non-existent config key: service-ca.crt
Apr 17 21:15:34.382840 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:34.382558 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs podName:a7386c80-9465-42c2-b807-c545f0e73d34 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:38.382548478 +0000 UTC m=+142.079103528 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs") pod "router-default-7c95fdb69b-rgqgn" (UID: "a7386c80-9465-42c2-b807-c545f0e73d34") : secret "router-metrics-certs-default" not found
Apr 17 21:15:36.203727 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:36.203698 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qnj2m_778ea300-0830-4ba5-8bc4-8bc4314e7652/dns-node-resolver/0.log"
Apr 17 21:15:37.004365 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:37.004339 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qsc7l_500b1ed9-0358-4553-b67c-814ae8a286af/node-ca/0.log"
Apr 17 21:15:38.411535 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.411494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:38.411535 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.411540 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:38.411957 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:38.411641 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 21:15:38.411957 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:38.411663 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle podName:a7386c80-9465-42c2-b807-c545f0e73d34 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:46.411644775 +0000 UTC m=+150.108199825 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle") pod "router-default-7c95fdb69b-rgqgn" (UID: "a7386c80-9465-42c2-b807-c545f0e73d34") : configmap references non-existent config key: service-ca.crt
Apr 17 21:15:38.411957 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:38.411688 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs podName:a7386c80-9465-42c2-b807-c545f0e73d34 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:46.411680203 +0000 UTC m=+150.108235253 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs") pod "router-default-7c95fdb69b-rgqgn" (UID: "a7386c80-9465-42c2-b807-c545f0e73d34") : secret "router-metrics-certs-default" not found Apr 17 21:15:38.627748 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.627716 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm"] Apr 17 21:15:38.630363 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.630348 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm" Apr 17 21:15:38.632493 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.632471 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 21:15:38.632627 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.632594 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 21:15:38.632627 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.632603 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:15:38.633334 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.633319 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-bd44x\"" Apr 17 21:15:38.633386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.633362 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 21:15:38.639653 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.639627 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm"] Apr 17 21:15:38.713201 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.713084 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfbb2b11-b5bf-4910-b508-d65f63da7218-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xtmbm\" (UID: \"bfbb2b11-b5bf-4910-b508-d65f63da7218\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm" Apr 17 21:15:38.713201 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.713193 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt6cj\" (UniqueName: \"kubernetes.io/projected/bfbb2b11-b5bf-4910-b508-d65f63da7218-kube-api-access-tt6cj\") pod \"kube-storage-version-migrator-operator-6769c5d45-xtmbm\" (UID: \"bfbb2b11-b5bf-4910-b508-d65f63da7218\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm" Apr 17 21:15:38.713379 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.713223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfbb2b11-b5bf-4910-b508-d65f63da7218-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xtmbm\" (UID: \"bfbb2b11-b5bf-4910-b508-d65f63da7218\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm" Apr 17 21:15:38.813815 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.813783 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfbb2b11-b5bf-4910-b508-d65f63da7218-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xtmbm\" (UID: \"bfbb2b11-b5bf-4910-b508-d65f63da7218\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm"
Apr 17 21:15:38.813966 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.813849 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tt6cj\" (UniqueName: \"kubernetes.io/projected/bfbb2b11-b5bf-4910-b508-d65f63da7218-kube-api-access-tt6cj\") pod \"kube-storage-version-migrator-operator-6769c5d45-xtmbm\" (UID: \"bfbb2b11-b5bf-4910-b508-d65f63da7218\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm"
Apr 17 21:15:38.813966 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.813874 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfbb2b11-b5bf-4910-b508-d65f63da7218-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xtmbm\" (UID: \"bfbb2b11-b5bf-4910-b508-d65f63da7218\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm"
Apr 17 21:15:38.814476 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.814444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfbb2b11-b5bf-4910-b508-d65f63da7218-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xtmbm\" (UID: \"bfbb2b11-b5bf-4910-b508-d65f63da7218\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm"
Apr 17 21:15:38.816093 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.816074 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfbb2b11-b5bf-4910-b508-d65f63da7218-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xtmbm\" (UID: \"bfbb2b11-b5bf-4910-b508-d65f63da7218\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm"
Apr 17 21:15:38.821361 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.821342 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt6cj\" (UniqueName: \"kubernetes.io/projected/bfbb2b11-b5bf-4910-b508-d65f63da7218-kube-api-access-tt6cj\") pod \"kube-storage-version-migrator-operator-6769c5d45-xtmbm\" (UID: \"bfbb2b11-b5bf-4910-b508-d65f63da7218\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm"
Apr 17 21:15:38.938978 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:38.938944 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm"
Apr 17 21:15:39.048080 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:39.048051 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm"]
Apr 17 21:15:39.051027 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:15:39.050995 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfbb2b11_b5bf_4910_b508_d65f63da7218.slice/crio-5bfe4043cb8c2e8ac94ae3f970ddf244bdf9196e826abe0a6d35429c3a7d93e6 WatchSource:0}: Error finding container 5bfe4043cb8c2e8ac94ae3f970ddf244bdf9196e826abe0a6d35429c3a7d93e6: Status 404 returned error can't find the container with id 5bfe4043cb8c2e8ac94ae3f970ddf244bdf9196e826abe0a6d35429c3a7d93e6
Apr 17 21:15:39.290121 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:39.290039 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm" event={"ID":"bfbb2b11-b5bf-4910-b508-d65f63da7218","Type":"ContainerStarted","Data":"5bfe4043cb8c2e8ac94ae3f970ddf244bdf9196e826abe0a6d35429c3a7d93e6"}
Apr 17 21:15:40.705337 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.705281 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"]
Apr 17 21:15:40.708423 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.708401 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"
Apr 17 21:15:40.710615 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.710557 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 17 21:15:40.710615 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.710567 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 17 21:15:40.710779 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.710627 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 17 21:15:40.710779 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.710705 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 17 21:15:40.711455 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.711432 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-pcrrd\""
Apr 17 21:15:40.714951 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.714909 2572 kubelet.go:2544] "SyncLoop UPDATE"
source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"]
Apr 17 21:15:40.828891 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.828852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6tgp\" (UniqueName: \"kubernetes.io/projected/b2eabc41-fed0-438c-adb4-7eda4a023735-kube-api-access-z6tgp\") pod \"service-ca-operator-d6fc45fc5-c54gp\" (UID: \"b2eabc41-fed0-438c-adb4-7eda4a023735\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"
Apr 17 21:15:40.829086 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.828910 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2eabc41-fed0-438c-adb4-7eda4a023735-serving-cert\") pod \"service-ca-operator-d6fc45fc5-c54gp\" (UID: \"b2eabc41-fed0-438c-adb4-7eda4a023735\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"
Apr 17 21:15:40.829086 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.829073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2eabc41-fed0-438c-adb4-7eda4a023735-config\") pod \"service-ca-operator-d6fc45fc5-c54gp\" (UID: \"b2eabc41-fed0-438c-adb4-7eda4a023735\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"
Apr 17 21:15:40.929607 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.929581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6tgp\" (UniqueName: \"kubernetes.io/projected/b2eabc41-fed0-438c-adb4-7eda4a023735-kube-api-access-z6tgp\") pod \"service-ca-operator-d6fc45fc5-c54gp\" (UID: \"b2eabc41-fed0-438c-adb4-7eda4a023735\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"
Apr 17 21:15:40.929744 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.929634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2eabc41-fed0-438c-adb4-7eda4a023735-serving-cert\") pod \"service-ca-operator-d6fc45fc5-c54gp\" (UID: \"b2eabc41-fed0-438c-adb4-7eda4a023735\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"
Apr 17 21:15:40.929794 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.929753 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2eabc41-fed0-438c-adb4-7eda4a023735-config\") pod \"service-ca-operator-d6fc45fc5-c54gp\" (UID: \"b2eabc41-fed0-438c-adb4-7eda4a023735\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"
Apr 17 21:15:40.930649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.930624 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2eabc41-fed0-438c-adb4-7eda4a023735-config\") pod \"service-ca-operator-d6fc45fc5-c54gp\" (UID: \"b2eabc41-fed0-438c-adb4-7eda4a023735\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"
Apr 17 21:15:40.931823 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.931806 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2eabc41-fed0-438c-adb4-7eda4a023735-serving-cert\") pod \"service-ca-operator-d6fc45fc5-c54gp\" (UID: \"b2eabc41-fed0-438c-adb4-7eda4a023735\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"
Apr 17 21:15:40.936529 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:40.936508 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6tgp\" (UniqueName: \"kubernetes.io/projected/b2eabc41-fed0-438c-adb4-7eda4a023735-kube-api-access-z6tgp\") pod \"service-ca-operator-d6fc45fc5-c54gp\" (UID: \"b2eabc41-fed0-438c-adb4-7eda4a023735\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"
Apr 17 21:15:41.019472 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.019433 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"
Apr 17 21:15:41.130773 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.130725 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp"]
Apr 17 21:15:41.135245 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:15:41.135222 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2eabc41_fed0_438c_adb4_7eda4a023735.slice/crio-c4e13389e0ea394273fa9e37efc0790df72c5719bc90c2434cbfe26b8c5433b6 WatchSource:0}: Error finding container c4e13389e0ea394273fa9e37efc0790df72c5719bc90c2434cbfe26b8c5433b6: Status 404 returned error can't find the container with id c4e13389e0ea394273fa9e37efc0790df72c5719bc90c2434cbfe26b8c5433b6
Apr 17 21:15:41.295331 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.295299 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm" event={"ID":"bfbb2b11-b5bf-4910-b508-d65f63da7218","Type":"ContainerStarted","Data":"16f7d83fdb88b8ad235185059a79d4afbc678781dd48835a0c7ec91761fc6d88"}
Apr 17 21:15:41.296362 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.296330 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp" event={"ID":"b2eabc41-fed0-438c-adb4-7eda4a023735","Type":"ContainerStarted","Data":"c4e13389e0ea394273fa9e37efc0790df72c5719bc90c2434cbfe26b8c5433b6"}
Apr 17 21:15:41.310978 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.310936 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm" podStartSLOduration=1.4398820159999999 podStartE2EDuration="3.310923778s" podCreationTimestamp="2026-04-17 21:15:38 +0000 UTC" firstStartedPulling="2026-04-17 21:15:39.052633805 +0000 UTC m=+142.749188858" lastFinishedPulling="2026-04-17 21:15:40.92367557 +0000 UTC m=+144.620230620" observedRunningTime="2026-04-17 21:15:41.310540792 +0000 UTC m=+145.007095867" watchObservedRunningTime="2026-04-17 21:15:41.310923778 +0000 UTC m=+145.007478850"
Apr 17 21:15:41.368375 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.368345 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7f8kn"]
Apr 17 21:15:41.372613 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.372597 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7f8kn"
Apr 17 21:15:41.374619 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.374599 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-jb2bj\""
Apr 17 21:15:41.377573 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.377550 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7f8kn"]
Apr 17 21:15:41.432359 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.432338 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p5gg\" (UniqueName: \"kubernetes.io/projected/840ce509-ff42-4b0d-8c8b-7646e4c627f2-kube-api-access-5p5gg\") pod \"network-check-source-8894fc9bd-7f8kn\" (UID: \"840ce509-ff42-4b0d-8c8b-7646e4c627f2\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7f8kn"
Apr 17 21:15:41.533012 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.532982 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p5gg\" (UniqueName: \"kubernetes.io/projected/840ce509-ff42-4b0d-8c8b-7646e4c627f2-kube-api-access-5p5gg\") pod \"network-check-source-8894fc9bd-7f8kn\" (UID: \"840ce509-ff42-4b0d-8c8b-7646e4c627f2\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7f8kn"
Apr 17 21:15:41.542235 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.542193 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p5gg\" (UniqueName: \"kubernetes.io/projected/840ce509-ff42-4b0d-8c8b-7646e4c627f2-kube-api-access-5p5gg\") pod \"network-check-source-8894fc9bd-7f8kn\" (UID: \"840ce509-ff42-4b0d-8c8b-7646e4c627f2\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7f8kn"
Apr 17 21:15:41.682363 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.682295 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7f8kn"
Apr 17 21:15:41.809200 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:41.809155 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7f8kn"]
Apr 17 21:15:41.813138 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:15:41.813100 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod840ce509_ff42_4b0d_8c8b_7646e4c627f2.slice/crio-7e38866697a8b58f4d30fc33764676812773ee8e8a160544951c0ab7e91a4714 WatchSource:0}: Error finding container 7e38866697a8b58f4d30fc33764676812773ee8e8a160544951c0ab7e91a4714: Status 404 returned error can't find the container with id 7e38866697a8b58f4d30fc33764676812773ee8e8a160544951c0ab7e91a4714
Apr 17 21:15:42.300809 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:42.300761 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7f8kn" event={"ID":"840ce509-ff42-4b0d-8c8b-7646e4c627f2","Type":"ContainerStarted","Data":"c7c4a508e59caf076b2d5d3c3017eb569a072be429f1ac0c56fdec9263955e9f"}
Apr 17 21:15:42.300995 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:42.300820 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7f8kn" event={"ID":"840ce509-ff42-4b0d-8c8b-7646e4c627f2","Type":"ContainerStarted","Data":"7e38866697a8b58f4d30fc33764676812773ee8e8a160544951c0ab7e91a4714"}
Apr 17 21:15:42.314506 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:42.314462 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7f8kn" podStartSLOduration=1.314449746 podStartE2EDuration="1.314449746s" podCreationTimestamp="2026-04-17 21:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:15:42.313427005 +0000 UTC m=+146.009982068" watchObservedRunningTime="2026-04-17 21:15:42.314449746 +0000 UTC m=+146.011004818"
Apr 17 21:15:43.304882 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:43.304845 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp" event={"ID":"b2eabc41-fed0-438c-adb4-7eda4a023735","Type":"ContainerStarted","Data":"977ce6f91fbc8e4930ef7fb9f60785ba4aec92ef037f063c9eb92b60b267f175"}
Apr 17 21:15:43.319331 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:43.319240 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp" podStartSLOduration=1.398644984 podStartE2EDuration="3.319227816s" podCreationTimestamp="2026-04-17 21:15:40 +0000 UTC" firstStartedPulling="2026-04-17 21:15:41.137135416 +0000 UTC m=+144.833690589"
lastFinishedPulling="2026-04-17 21:15:43.057718371 +0000 UTC m=+146.754273421" observedRunningTime="2026-04-17 21:15:43.317731184 +0000 UTC m=+147.014286256" watchObservedRunningTime="2026-04-17 21:15:43.319227816 +0000 UTC m=+147.015782882"
Apr 17 21:15:46.474960 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.474927 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:46.475399 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.475011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn"
Apr 17 21:15:46.475399 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:46.475069 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 21:15:46.475399 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:46.475122 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle podName:a7386c80-9465-42c2-b807-c545f0e73d34 nodeName:}" failed. No retries permitted until 2026-04-17 21:16:02.475106729 +0000 UTC m=+166.171661779 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle") pod "router-default-7c95fdb69b-rgqgn" (UID: "a7386c80-9465-42c2-b807-c545f0e73d34") : configmap references non-existent config key: service-ca.crt
Apr 17 21:15:46.475399 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:46.475136 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs podName:a7386c80-9465-42c2-b807-c545f0e73d34 nodeName:}" failed. No retries permitted until 2026-04-17 21:16:02.475129883 +0000 UTC m=+166.171684933 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs") pod "router-default-7c95fdb69b-rgqgn" (UID: "a7386c80-9465-42c2-b807-c545f0e73d34") : secret "router-metrics-certs-default" not found
Apr 17 21:15:46.685734 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.685698 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hj29w"]
Apr 17 21:15:46.688347 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.688332 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-hj29w"
Apr 17 21:15:46.690732 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.690713 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 21:15:46.690859 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.690783 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 21:15:46.690950 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.690936 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 21:15:46.691576 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.691559 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qdvfc\""
Apr 17 21:15:46.691576 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.691570 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 21:15:46.694967 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.694943 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hj29w"]
Apr 17 21:15:46.776159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.776094 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/13b6205e-ab82-4610-a2cc-b0fb4957873a-signing-key\") pod \"service-ca-865cb79987-hj29w\" (UID: \"13b6205e-ab82-4610-a2cc-b0fb4957873a\") " pod="openshift-service-ca/service-ca-865cb79987-hj29w"
Apr 17 21:15:46.776313 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.776195 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gbxp\" (UniqueName:
\"kubernetes.io/projected/13b6205e-ab82-4610-a2cc-b0fb4957873a-kube-api-access-4gbxp\") pod \"service-ca-865cb79987-hj29w\" (UID: \"13b6205e-ab82-4610-a2cc-b0fb4957873a\") " pod="openshift-service-ca/service-ca-865cb79987-hj29w"
Apr 17 21:15:46.776313 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.776235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/13b6205e-ab82-4610-a2cc-b0fb4957873a-signing-cabundle\") pod \"service-ca-865cb79987-hj29w\" (UID: \"13b6205e-ab82-4610-a2cc-b0fb4957873a\") " pod="openshift-service-ca/service-ca-865cb79987-hj29w"
Apr 17 21:15:46.876766 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.876745 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/13b6205e-ab82-4610-a2cc-b0fb4957873a-signing-key\") pod \"service-ca-865cb79987-hj29w\" (UID: \"13b6205e-ab82-4610-a2cc-b0fb4957873a\") " pod="openshift-service-ca/service-ca-865cb79987-hj29w"
Apr 17 21:15:46.876914 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.876796 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gbxp\" (UniqueName: \"kubernetes.io/projected/13b6205e-ab82-4610-a2cc-b0fb4957873a-kube-api-access-4gbxp\") pod \"service-ca-865cb79987-hj29w\" (UID: \"13b6205e-ab82-4610-a2cc-b0fb4957873a\") " pod="openshift-service-ca/service-ca-865cb79987-hj29w"
Apr 17 21:15:46.876914 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.876824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/13b6205e-ab82-4610-a2cc-b0fb4957873a-signing-cabundle\") pod \"service-ca-865cb79987-hj29w\" (UID: \"13b6205e-ab82-4610-a2cc-b0fb4957873a\") " pod="openshift-service-ca/service-ca-865cb79987-hj29w"
Apr 17 21:15:46.877419 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.877401 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/13b6205e-ab82-4610-a2cc-b0fb4957873a-signing-cabundle\") pod \"service-ca-865cb79987-hj29w\" (UID: \"13b6205e-ab82-4610-a2cc-b0fb4957873a\") " pod="openshift-service-ca/service-ca-865cb79987-hj29w"
Apr 17 21:15:46.879268 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.879247 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/13b6205e-ab82-4610-a2cc-b0fb4957873a-signing-key\") pod \"service-ca-865cb79987-hj29w\" (UID: \"13b6205e-ab82-4610-a2cc-b0fb4957873a\") " pod="openshift-service-ca/service-ca-865cb79987-hj29w"
Apr 17 21:15:46.884764 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.884744 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gbxp\" (UniqueName: \"kubernetes.io/projected/13b6205e-ab82-4610-a2cc-b0fb4957873a-kube-api-access-4gbxp\") pod \"service-ca-865cb79987-hj29w\" (UID: \"13b6205e-ab82-4610-a2cc-b0fb4957873a\") " pod="openshift-service-ca/service-ca-865cb79987-hj29w"
Apr 17 21:15:46.996696 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:46.996675 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-hj29w"
Apr 17 21:15:47.109431 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:47.109406 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hj29w"]
Apr 17 21:15:47.112393 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:15:47.112367 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13b6205e_ab82_4610_a2cc_b0fb4957873a.slice/crio-a18b2e34c958539bd63d71ae1836b2dafac83703f19e5b6ca61a621b65af951f WatchSource:0}: Error finding container a18b2e34c958539bd63d71ae1836b2dafac83703f19e5b6ca61a621b65af951f: Status 404 returned error can't find the container with id a18b2e34c958539bd63d71ae1836b2dafac83703f19e5b6ca61a621b65af951f
Apr 17 21:15:47.315456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:47.315362 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-hj29w" event={"ID":"13b6205e-ab82-4610-a2cc-b0fb4957873a","Type":"ContainerStarted","Data":"20d9428fa618237ce6ae268b5a8c176204820a16206f01f1daa502929af3f790"}
Apr 17 21:15:47.315456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:47.315403 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-hj29w" event={"ID":"13b6205e-ab82-4610-a2cc-b0fb4957873a","Type":"ContainerStarted","Data":"a18b2e34c958539bd63d71ae1836b2dafac83703f19e5b6ca61a621b65af951f"}
Apr 17 21:15:47.330360 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:47.330318 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-hj29w" podStartSLOduration=1.330305636 podStartE2EDuration="1.330305636s" podCreationTimestamp="2026-04-17 21:15:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17
21:15:47.329704061 +0000 UTC m=+151.026259137" watchObservedRunningTime="2026-04-17 21:15:47.330305636 +0000 UTC m=+151.026860707"
Apr 17 21:15:51.675869 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:51.675818 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-s9qzj" podUID="4c6d8acf-b971-44ec-b14c-d6af7a3da43c"
Apr 17 21:15:51.699092 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:51.699071 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dfvbk" podUID="03a3935c-b9b0-4fda-8142-8cd63a6b5a88"
Apr 17 21:15:51.876165 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:15:51.876116 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-nprtf" podUID="4574ebc4-f71f-480d-9e07-5e822f12bb1a"
Apr 17 21:15:52.327246 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:52.327218 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:15:56.657368 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:56.657330 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk"
Apr 17 21:15:56.657796 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:56.657392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:15:56.659730 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:56.659705 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03a3935c-b9b0-4fda-8142-8cd63a6b5a88-cert\") pod \"ingress-canary-dfvbk\" (UID: \"03a3935c-b9b0-4fda-8142-8cd63a6b5a88\") " pod="openshift-ingress-canary/ingress-canary-dfvbk"
Apr 17 21:15:56.660040 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:56.660026 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c6d8acf-b971-44ec-b14c-d6af7a3da43c-metrics-tls\") pod \"dns-default-s9qzj\" (UID: \"4c6d8acf-b971-44ec-b14c-d6af7a3da43c\") " pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:15:56.830383 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:56.830346 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-k464c\""
Apr 17 21:15:56.839088 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:56.839063 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:15:56.961554 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:56.961522 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s9qzj"]
Apr 17 21:15:56.964828 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:15:56.964803 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c6d8acf_b971_44ec_b14c_d6af7a3da43c.slice/crio-bf7cb2d7fd75756990e5a668537a34177bbf833455ba877bf9468dfee1c48b56 WatchSource:0}: Error finding container bf7cb2d7fd75756990e5a668537a34177bbf833455ba877bf9468dfee1c48b56: Status 404 returned error can't find the container with id bf7cb2d7fd75756990e5a668537a34177bbf833455ba877bf9468dfee1c48b56
Apr 17 21:15:57.341127 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:57.341091 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s9qzj" event={"ID":"4c6d8acf-b971-44ec-b14c-d6af7a3da43c","Type":"ContainerStarted","Data":"bf7cb2d7fd75756990e5a668537a34177bbf833455ba877bf9468dfee1c48b56"}
Apr 17 21:15:59.347954 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:59.347915 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s9qzj" event={"ID":"4c6d8acf-b971-44ec-b14c-d6af7a3da43c","Type":"ContainerStarted","Data":"6557bee0ff8417f747a9da17fd69fac564d4abca80c418479e4619af19a07aa9"}
Apr 17 21:15:59.347954 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:59.347950 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s9qzj" event={"ID":"4c6d8acf-b971-44ec-b14c-d6af7a3da43c","Type":"ContainerStarted","Data":"680bfb92753cfde83ded21d0dfa70af72e5c609f37ff781bb931657e631e3450"}
Apr 17 21:15:59.348403 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:59.348045 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:15:59.363757
ip-10-0-135-174 kubenswrapper[2572]: I0417 21:15:59.363715 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-s9qzj" podStartSLOduration=129.758300807 podStartE2EDuration="2m11.363703366s" podCreationTimestamp="2026-04-17 21:13:48 +0000 UTC" firstStartedPulling="2026-04-17 21:15:56.966685898 +0000 UTC m=+160.663240947" lastFinishedPulling="2026-04-17 21:15:58.572088456 +0000 UTC m=+162.268643506" observedRunningTime="2026-04-17 21:15:59.362541985 +0000 UTC m=+163.059097057" watchObservedRunningTime="2026-04-17 21:15:59.363703366 +0000 UTC m=+163.060258437" Apr 17 21:16:02.501486 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:02.501442 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn" Apr 17 21:16:02.501939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:02.501497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn" Apr 17 21:16:02.502068 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:02.502050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7386c80-9465-42c2-b807-c545f0e73d34-service-ca-bundle\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn" Apr 17 21:16:02.503700 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:02.503683 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7386c80-9465-42c2-b807-c545f0e73d34-metrics-certs\") pod \"router-default-7c95fdb69b-rgqgn\" (UID: \"a7386c80-9465-42c2-b807-c545f0e73d34\") " pod="openshift-ingress/router-default-7c95fdb69b-rgqgn" Apr 17 21:16:02.785222 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:02.785126 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7c95fdb69b-rgqgn" Apr 17 21:16:02.900285 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:02.900256 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7c95fdb69b-rgqgn"] Apr 17 21:16:02.903074 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:16:02.903046 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7386c80_9465_42c2_b807_c545f0e73d34.slice/crio-0dd6a96504b1569593c92098f308ecd1d62583e1c7ac40f5f1899b00f322326f WatchSource:0}: Error finding container 0dd6a96504b1569593c92098f308ecd1d62583e1c7ac40f5f1899b00f322326f: Status 404 returned error can't find the container with id 0dd6a96504b1569593c92098f308ecd1d62583e1c7ac40f5f1899b00f322326f Apr 17 21:16:03.358793 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:03.358761 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7c95fdb69b-rgqgn" event={"ID":"a7386c80-9465-42c2-b807-c545f0e73d34","Type":"ContainerStarted","Data":"7f5442a6bf14cc4c20914f3674627199bea4e7cd308deb3557c7b0b576ab2bab"} Apr 17 21:16:03.358962 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:03.358799 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7c95fdb69b-rgqgn" event={"ID":"a7386c80-9465-42c2-b807-c545f0e73d34","Type":"ContainerStarted","Data":"0dd6a96504b1569593c92098f308ecd1d62583e1c7ac40f5f1899b00f322326f"} Apr 17 21:16:03.376257 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:16:03.376217 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7c95fdb69b-rgqgn" podStartSLOduration=33.376204146 podStartE2EDuration="33.376204146s" podCreationTimestamp="2026-04-17 21:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:16:03.375592862 +0000 UTC m=+167.072147933" watchObservedRunningTime="2026-04-17 21:16:03.376204146 +0000 UTC m=+167.072759218" Apr 17 21:16:03.785742 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:03.785712 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7c95fdb69b-rgqgn" Apr 17 21:16:03.788187 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:03.788149 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7c95fdb69b-rgqgn" Apr 17 21:16:03.863712 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:03.863691 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dfvbk" Apr 17 21:16:03.866213 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:03.866190 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ggwsl\"" Apr 17 21:16:03.874993 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:03.874973 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dfvbk" Apr 17 21:16:03.984295 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:03.984268 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dfvbk"] Apr 17 21:16:03.987265 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:16:03.987237 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03a3935c_b9b0_4fda_8142_8cd63a6b5a88.slice/crio-3fc78aa2ecabe28f3f5d9e83b086c00761780162453bc86ca4fb5df9d5afa0ad WatchSource:0}: Error finding container 3fc78aa2ecabe28f3f5d9e83b086c00761780162453bc86ca4fb5df9d5afa0ad: Status 404 returned error can't find the container with id 3fc78aa2ecabe28f3f5d9e83b086c00761780162453bc86ca4fb5df9d5afa0ad Apr 17 21:16:04.363021 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:04.362984 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dfvbk" event={"ID":"03a3935c-b9b0-4fda-8142-8cd63a6b5a88","Type":"ContainerStarted","Data":"3fc78aa2ecabe28f3f5d9e83b086c00761780162453bc86ca4fb5df9d5afa0ad"} Apr 17 21:16:04.363302 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:04.363285 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7c95fdb69b-rgqgn" Apr 17 21:16:04.364534 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:04.364517 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7c95fdb69b-rgqgn" Apr 17 21:16:04.864234 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:04.864194 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:16:05.721560 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.721532 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b"] Apr 17 21:16:05.724473 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.724453 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b" Apr 17 21:16:05.726652 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.726627 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 21:16:05.726775 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.726633 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-25bq9\"" Apr 17 21:16:05.734515 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.734485 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b"] Apr 17 21:16:05.826528 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.826492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e5ddf28a-4f8a-4a7c-af29-4c0befc12d5f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fz26b\" (UID: \"e5ddf28a-4f8a-4a7c-af29-4c0befc12d5f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b" Apr 17 21:16:05.840649 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.840621 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rjkj7"] Apr 17 21:16:05.843625 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.843605 
2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:05.845797 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.845762 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5zvnz\"" Apr 17 21:16:05.845892 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.845851 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 21:16:05.845978 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.845960 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 21:16:05.856223 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.856201 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rjkj7"] Apr 17 21:16:05.927112 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.927027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e5ddf28a-4f8a-4a7c-af29-4c0befc12d5f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fz26b\" (UID: \"e5ddf28a-4f8a-4a7c-af29-4c0befc12d5f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b" Apr 17 21:16:05.929602 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:05.929577 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e5ddf28a-4f8a-4a7c-af29-4c0befc12d5f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fz26b\" (UID: \"e5ddf28a-4f8a-4a7c-af29-4c0befc12d5f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b" Apr 17 21:16:06.028343 ip-10-0-135-174 kubenswrapper[2572]: I0417 
21:16:06.028309 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/88bea75e-5668-4862-a73b-e197ee6429fe-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.028495 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.028350 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/88bea75e-5668-4862-a73b-e197ee6429fe-crio-socket\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.028495 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.028384 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/88bea75e-5668-4862-a73b-e197ee6429fe-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.028495 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.028443 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r2k2\" (UniqueName: \"kubernetes.io/projected/88bea75e-5668-4862-a73b-e197ee6429fe-kube-api-access-7r2k2\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.028495 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.028483 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/88bea75e-5668-4862-a73b-e197ee6429fe-data-volume\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.035274 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.035245 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b" Apr 17 21:16:06.129712 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.129676 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/88bea75e-5668-4862-a73b-e197ee6429fe-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.129844 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.129735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/88bea75e-5668-4862-a73b-e197ee6429fe-crio-socket\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.129844 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.129772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/88bea75e-5668-4862-a73b-e197ee6429fe-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.129844 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.129825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r2k2\" (UniqueName: 
\"kubernetes.io/projected/88bea75e-5668-4862-a73b-e197ee6429fe-kube-api-access-7r2k2\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.129998 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.129847 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/88bea75e-5668-4862-a73b-e197ee6429fe-crio-socket\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.129998 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.129865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/88bea75e-5668-4862-a73b-e197ee6429fe-data-volume\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.130219 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.130198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/88bea75e-5668-4862-a73b-e197ee6429fe-data-volume\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.131110 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.131090 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/88bea75e-5668-4862-a73b-e197ee6429fe-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.132286 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.132268 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/88bea75e-5668-4862-a73b-e197ee6429fe-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.136782 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.136761 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r2k2\" (UniqueName: \"kubernetes.io/projected/88bea75e-5668-4862-a73b-e197ee6429fe-kube-api-access-7r2k2\") pod \"insights-runtime-extractor-rjkj7\" (UID: \"88bea75e-5668-4862-a73b-e197ee6429fe\") " pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.145366 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.145346 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b"] Apr 17 21:16:06.148261 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:16:06.148230 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ddf28a_4f8a_4a7c_af29_4c0befc12d5f.slice/crio-73496f8cb1f92876fe55bb19bc9f6bbf53c72fffbd4010dfe50099422e1a469e WatchSource:0}: Error finding container 73496f8cb1f92876fe55bb19bc9f6bbf53c72fffbd4010dfe50099422e1a469e: Status 404 returned error can't find the container with id 73496f8cb1f92876fe55bb19bc9f6bbf53c72fffbd4010dfe50099422e1a469e Apr 17 21:16:06.154476 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.154457 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rjkj7" Apr 17 21:16:06.268694 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.268659 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rjkj7"] Apr 17 21:16:06.272339 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:16:06.272307 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88bea75e_5668_4862_a73b_e197ee6429fe.slice/crio-8a1ba9d16898f54802782c12e3188a28c908c754c0af4f9f1f3e20db0ca968f4 WatchSource:0}: Error finding container 8a1ba9d16898f54802782c12e3188a28c908c754c0af4f9f1f3e20db0ca968f4: Status 404 returned error can't find the container with id 8a1ba9d16898f54802782c12e3188a28c908c754c0af4f9f1f3e20db0ca968f4 Apr 17 21:16:06.369210 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.369162 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dfvbk" event={"ID":"03a3935c-b9b0-4fda-8142-8cd63a6b5a88","Type":"ContainerStarted","Data":"7f0a683f3d288e0173ac5225a8dd5a5711c52c777a0cdde962aff95ba02dc960"} Apr 17 21:16:06.370429 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.370405 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rjkj7" event={"ID":"88bea75e-5668-4862-a73b-e197ee6429fe","Type":"ContainerStarted","Data":"2169e2aa0692f1679446733c65bf257604cb327e11fa238d29fce5cf64d0d3cb"} Apr 17 21:16:06.370429 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.370430 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rjkj7" event={"ID":"88bea75e-5668-4862-a73b-e197ee6429fe","Type":"ContainerStarted","Data":"8a1ba9d16898f54802782c12e3188a28c908c754c0af4f9f1f3e20db0ca968f4"} Apr 17 21:16:06.371272 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.371252 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b" event={"ID":"e5ddf28a-4f8a-4a7c-af29-4c0befc12d5f","Type":"ContainerStarted","Data":"73496f8cb1f92876fe55bb19bc9f6bbf53c72fffbd4010dfe50099422e1a469e"} Apr 17 21:16:06.383215 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:06.383153 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dfvbk" podStartSLOduration=136.747955415 podStartE2EDuration="2m18.38314095s" podCreationTimestamp="2026-04-17 21:13:48 +0000 UTC" firstStartedPulling="2026-04-17 21:16:03.989068117 +0000 UTC m=+167.685623168" lastFinishedPulling="2026-04-17 21:16:05.624253652 +0000 UTC m=+169.320808703" observedRunningTime="2026-04-17 21:16:06.382290499 +0000 UTC m=+170.078845571" watchObservedRunningTime="2026-04-17 21:16:06.38314095 +0000 UTC m=+170.079696027" Apr 17 21:16:08.377783 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.377739 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rjkj7" event={"ID":"88bea75e-5668-4862-a73b-e197ee6429fe","Type":"ContainerStarted","Data":"186c37e70b17d25c42edf518045df4adb063b0047d1985990685e84cb36e9829"} Apr 17 21:16:08.379126 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.379097 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b" event={"ID":"e5ddf28a-4f8a-4a7c-af29-4c0befc12d5f","Type":"ContainerStarted","Data":"45afc8438494006193e96977eecb06cdea84f79124e1db253770587d3853fc29"} Apr 17 21:16:08.379419 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.379396 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b" Apr 17 21:16:08.385728 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.385677 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b" Apr 17 21:16:08.393927 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.393874 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fz26b" podStartSLOduration=2.199131156 podStartE2EDuration="3.39385745s" podCreationTimestamp="2026-04-17 21:16:05 +0000 UTC" firstStartedPulling="2026-04-17 21:16:06.150157027 +0000 UTC m=+169.846712078" lastFinishedPulling="2026-04-17 21:16:07.344883322 +0000 UTC m=+171.041438372" observedRunningTime="2026-04-17 21:16:08.392602746 +0000 UTC m=+172.089157819" watchObservedRunningTime="2026-04-17 21:16:08.39385745 +0000 UTC m=+172.090412527" Apr 17 21:16:08.553571 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.553534 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-7rgcx"] Apr 17 21:16:08.557998 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.557973 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx" Apr 17 21:16:08.560828 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.560652 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 21:16:08.560828 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.560666 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 21:16:08.560828 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.560678 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 21:16:08.560828 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.560701 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 21:16:08.561106 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.560832 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-zrdpp\"" Apr 17 21:16:08.561106 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.560993 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 21:16:08.566939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.566916 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-7rgcx"] Apr 17 21:16:08.748997 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.748960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2a733fe4-f6c7-40c4-9948-73411f5dd161-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-7rgcx\" (UID: \"2a733fe4-f6c7-40c4-9948-73411f5dd161\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx" Apr 17 21:16:08.749193 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.749008 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a733fe4-f6c7-40c4-9948-73411f5dd161-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7rgcx\" (UID: \"2a733fe4-f6c7-40c4-9948-73411f5dd161\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx" Apr 17 21:16:08.749193 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.749044 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcdzn\" (UniqueName: \"kubernetes.io/projected/2a733fe4-f6c7-40c4-9948-73411f5dd161-kube-api-access-lcdzn\") pod \"prometheus-operator-5676c8c784-7rgcx\" (UID: \"2a733fe4-f6c7-40c4-9948-73411f5dd161\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx" Apr 17 21:16:08.749193 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.749099 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2a733fe4-f6c7-40c4-9948-73411f5dd161-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-7rgcx\" (UID: \"2a733fe4-f6c7-40c4-9948-73411f5dd161\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx" Apr 17 21:16:08.849949 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.849911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2a733fe4-f6c7-40c4-9948-73411f5dd161-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-7rgcx\" (UID: \"2a733fe4-f6c7-40c4-9948-73411f5dd161\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx" Apr 17 21:16:08.850103 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:16:08.849958 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a733fe4-f6c7-40c4-9948-73411f5dd161-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7rgcx\" (UID: \"2a733fe4-f6c7-40c4-9948-73411f5dd161\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx"
Apr 17 21:16:08.850103 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.850090 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcdzn\" (UniqueName: \"kubernetes.io/projected/2a733fe4-f6c7-40c4-9948-73411f5dd161-kube-api-access-lcdzn\") pod \"prometheus-operator-5676c8c784-7rgcx\" (UID: \"2a733fe4-f6c7-40c4-9948-73411f5dd161\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx"
Apr 17 21:16:08.850200 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.850152 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2a733fe4-f6c7-40c4-9948-73411f5dd161-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-7rgcx\" (UID: \"2a733fe4-f6c7-40c4-9948-73411f5dd161\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx"
Apr 17 21:16:08.850540 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.850521 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2a733fe4-f6c7-40c4-9948-73411f5dd161-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-7rgcx\" (UID: \"2a733fe4-f6c7-40c4-9948-73411f5dd161\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx"
Apr 17 21:16:08.852306 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.852286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a733fe4-f6c7-40c4-9948-73411f5dd161-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7rgcx\" (UID: \"2a733fe4-f6c7-40c4-9948-73411f5dd161\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx"
Apr 17 21:16:08.852393 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.852320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2a733fe4-f6c7-40c4-9948-73411f5dd161-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-7rgcx\" (UID: \"2a733fe4-f6c7-40c4-9948-73411f5dd161\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx"
Apr 17 21:16:08.857376 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.857352 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcdzn\" (UniqueName: \"kubernetes.io/projected/2a733fe4-f6c7-40c4-9948-73411f5dd161-kube-api-access-lcdzn\") pod \"prometheus-operator-5676c8c784-7rgcx\" (UID: \"2a733fe4-f6c7-40c4-9948-73411f5dd161\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx"
Apr 17 21:16:08.869213 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.869186 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx"
Apr 17 21:16:08.985743 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:08.985712 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-7rgcx"]
Apr 17 21:16:08.988887 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:16:08.988862 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a733fe4_f6c7_40c4_9948_73411f5dd161.slice/crio-9106aba7c14912f2946de0a13b27b824f03a96558173f51f7b530bc1a251b28a WatchSource:0}: Error finding container 9106aba7c14912f2946de0a13b27b824f03a96558173f51f7b530bc1a251b28a: Status 404 returned error can't find the container with id 9106aba7c14912f2946de0a13b27b824f03a96558173f51f7b530bc1a251b28a
Apr 17 21:16:09.352790 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:09.352760 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-s9qzj"
Apr 17 21:16:09.383878 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:09.383846 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rjkj7" event={"ID":"88bea75e-5668-4862-a73b-e197ee6429fe","Type":"ContainerStarted","Data":"c4c6a0f9d0b381e1bf0418a102f260faf43d7873694b658cfcfa4407c6958f52"}
Apr 17 21:16:09.385012 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:09.384987 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx" event={"ID":"2a733fe4-f6c7-40c4-9948-73411f5dd161","Type":"ContainerStarted","Data":"9106aba7c14912f2946de0a13b27b824f03a96558173f51f7b530bc1a251b28a"}
Apr 17 21:16:09.398850 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:09.398803 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rjkj7" podStartSLOduration=1.950664119 podStartE2EDuration="4.39878688s" podCreationTimestamp="2026-04-17 21:16:05 +0000 UTC" firstStartedPulling="2026-04-17 21:16:06.33028975 +0000 UTC m=+170.026844814" lastFinishedPulling="2026-04-17 21:16:08.778412519 +0000 UTC m=+172.474967575" observedRunningTime="2026-04-17 21:16:09.398511815 +0000 UTC m=+173.095066886" watchObservedRunningTime="2026-04-17 21:16:09.39878688 +0000 UTC m=+173.095341953"
Apr 17 21:16:10.396162 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:10.396129 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx" event={"ID":"2a733fe4-f6c7-40c4-9948-73411f5dd161","Type":"ContainerStarted","Data":"eb561c3f62d2b39a7e20071e0338391d478b92ff891f981f9a67444b4963cd38"}
Apr 17 21:16:11.399596 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:11.399561 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx" event={"ID":"2a733fe4-f6c7-40c4-9948-73411f5dd161","Type":"ContainerStarted","Data":"ca9bd36e3ba47cb0d5fb2bfc1449877b4e62e37958911166410f5d06f78da34d"}
Apr 17 21:16:11.414413 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:11.414366 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-7rgcx" podStartSLOduration=2.092884124 podStartE2EDuration="3.41435362s" podCreationTimestamp="2026-04-17 21:16:08 +0000 UTC" firstStartedPulling="2026-04-17 21:16:08.990638157 +0000 UTC m=+172.687193208" lastFinishedPulling="2026-04-17 21:16:10.312107639 +0000 UTC m=+174.008662704" observedRunningTime="2026-04-17 21:16:11.413593646 +0000 UTC m=+175.110148717" watchObservedRunningTime="2026-04-17 21:16:11.41435362 +0000 UTC m=+175.110908689"
Apr 17 21:16:12.884596 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.884564 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"]
Apr 17 21:16:12.888190 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.888154 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:12.890534 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.890492 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-67mrz\""
Apr 17 21:16:12.890642 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.890532 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 21:16:12.890642 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.890553 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 21:16:12.897768 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.897747 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"]
Apr 17 21:16:12.914536 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.914516 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-szdt5"]
Apr 17 21:16:12.917801 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.917782 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:12.920230 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.919952 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 21:16:12.920230 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.920004 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-m8ph4\""
Apr 17 21:16:12.920230 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.920096 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 21:16:12.920230 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.919955 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 21:16:12.980435 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980414 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/977c9f1c-37d4-44d9-b178-b63a50b4a640-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8l9gf\" (UID: \"977c9f1c-37d4-44d9-b178-b63a50b4a640\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:12.980543 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980450 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-wtmp\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:12.980543 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980480 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-sys\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:12.980631 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980561 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/977c9f1c-37d4-44d9-b178-b63a50b4a640-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8l9gf\" (UID: \"977c9f1c-37d4-44d9-b178-b63a50b4a640\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:12.980631 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980597 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-metrics-client-ca\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:12.980631 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980623 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-tls\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:12.980738 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980649 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-accelerators-collector-config\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:12.980738 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-root\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:12.980810 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980735 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:12.980810 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chlmx\" (UniqueName: \"kubernetes.io/projected/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-kube-api-access-chlmx\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:12.980810 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980770 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/977c9f1c-37d4-44d9-b178-b63a50b4a640-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8l9gf\" (UID: \"977c9f1c-37d4-44d9-b178-b63a50b4a640\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:12.980923 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980811 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kjtz\" (UniqueName: \"kubernetes.io/projected/977c9f1c-37d4-44d9-b178-b63a50b4a640-kube-api-access-6kjtz\") pod \"openshift-state-metrics-9d44df66c-8l9gf\" (UID: \"977c9f1c-37d4-44d9-b178-b63a50b4a640\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:12.980923 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:12.980834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-textfile\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.081427 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081397 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.081427 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081428 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chlmx\" (UniqueName: \"kubernetes.io/projected/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-kube-api-access-chlmx\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.081624 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081446 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/977c9f1c-37d4-44d9-b178-b63a50b4a640-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8l9gf\" (UID: \"977c9f1c-37d4-44d9-b178-b63a50b4a640\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:13.081624 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081468 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kjtz\" (UniqueName: \"kubernetes.io/projected/977c9f1c-37d4-44d9-b178-b63a50b4a640-kube-api-access-6kjtz\") pod \"openshift-state-metrics-9d44df66c-8l9gf\" (UID: \"977c9f1c-37d4-44d9-b178-b63a50b4a640\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:13.081624 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081488 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-textfile\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.081624 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081507 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/977c9f1c-37d4-44d9-b178-b63a50b4a640-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8l9gf\" (UID: \"977c9f1c-37d4-44d9-b178-b63a50b4a640\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:13.081817 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-wtmp\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.081817 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-sys\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.081817 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/977c9f1c-37d4-44d9-b178-b63a50b4a640-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8l9gf\" (UID: \"977c9f1c-37d4-44d9-b178-b63a50b4a640\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:13.081817 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-metrics-client-ca\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.081817 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081766 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-tls\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.081817 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-accelerators-collector-config\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.081817 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081813 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-root\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.082126 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081831 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-textfile\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.082126 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081845 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-wtmp\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.082126 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081894 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-sys\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.082126 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.081897 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-root\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.082350 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.082162 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/977c9f1c-37d4-44d9-b178-b63a50b4a640-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8l9gf\" (UID: \"977c9f1c-37d4-44d9-b178-b63a50b4a640\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:13.082770 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.082739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-metrics-client-ca\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.082878 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.082790 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-accelerators-collector-config\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.084149 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.084120 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/977c9f1c-37d4-44d9-b178-b63a50b4a640-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8l9gf\" (UID: \"977c9f1c-37d4-44d9-b178-b63a50b4a640\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:13.084362 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.084343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/977c9f1c-37d4-44d9-b178-b63a50b4a640-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8l9gf\" (UID: \"977c9f1c-37d4-44d9-b178-b63a50b4a640\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:13.084444 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.084407 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.084576 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.084561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-node-exporter-tls\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.089678 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.089654 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kjtz\" (UniqueName: \"kubernetes.io/projected/977c9f1c-37d4-44d9-b178-b63a50b4a640-kube-api-access-6kjtz\") pod \"openshift-state-metrics-9d44df66c-8l9gf\" (UID: \"977c9f1c-37d4-44d9-b178-b63a50b4a640\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:13.089817 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.089800 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chlmx\" (UniqueName: \"kubernetes.io/projected/d51a735f-f9c9-457e-9805-9c1c8b0e30cc-kube-api-access-chlmx\") pod \"node-exporter-szdt5\" (UID: \"d51a735f-f9c9-457e-9805-9c1c8b0e30cc\") " pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.198371 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.198285 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"
Apr 17 21:16:13.227289 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.227261 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-szdt5"
Apr 17 21:16:13.236599 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:16:13.236558 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd51a735f_f9c9_457e_9805_9c1c8b0e30cc.slice/crio-fc81b3df33e48af0aa8d2586337745cb5600399df55d1313aa0595a808ac5dd6 WatchSource:0}: Error finding container fc81b3df33e48af0aa8d2586337745cb5600399df55d1313aa0595a808ac5dd6: Status 404 returned error can't find the container with id fc81b3df33e48af0aa8d2586337745cb5600399df55d1313aa0595a808ac5dd6
Apr 17 21:16:13.320811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.320669 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf"]
Apr 17 21:16:13.323254 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:16:13.323229 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod977c9f1c_37d4_44d9_b178_b63a50b4a640.slice/crio-d56ac4330fe1ddf01669dfd10eb85d51818d2bf9d8f95d95c9f196ff397f358a WatchSource:0}: Error finding container d56ac4330fe1ddf01669dfd10eb85d51818d2bf9d8f95d95c9f196ff397f358a: Status 404 returned error can't find the container with id d56ac4330fe1ddf01669dfd10eb85d51818d2bf9d8f95d95c9f196ff397f358a
Apr 17 21:16:13.405037 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.405008 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-szdt5" event={"ID":"d51a735f-f9c9-457e-9805-9c1c8b0e30cc","Type":"ContainerStarted","Data":"fc81b3df33e48af0aa8d2586337745cb5600399df55d1313aa0595a808ac5dd6"}
Apr 17 21:16:13.406424 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.406388 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf" event={"ID":"977c9f1c-37d4-44d9-b178-b63a50b4a640","Type":"ContainerStarted","Data":"38df79b0701d2f227bdf6ba82763b492ca1e9eb8502d0cd955756dd87b94021c"}
Apr 17 21:16:13.406424 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:13.406414 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf" event={"ID":"977c9f1c-37d4-44d9-b178-b63a50b4a640","Type":"ContainerStarted","Data":"d56ac4330fe1ddf01669dfd10eb85d51818d2bf9d8f95d95c9f196ff397f358a"}
Apr 17 21:16:14.410310 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:14.410233 2572 generic.go:358] "Generic (PLEG): container finished" podID="d51a735f-f9c9-457e-9805-9c1c8b0e30cc" containerID="fea0b9471781a35d6d59d5685845073312f5d4df2ba824848f0817a1fb070b0f" exitCode=0
Apr 17 21:16:14.410683 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:14.410327 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-szdt5" event={"ID":"d51a735f-f9c9-457e-9805-9c1c8b0e30cc","Type":"ContainerDied","Data":"fea0b9471781a35d6d59d5685845073312f5d4df2ba824848f0817a1fb070b0f"}
Apr 17 21:16:14.411937 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:14.411916 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf" event={"ID":"977c9f1c-37d4-44d9-b178-b63a50b4a640","Type":"ContainerStarted","Data":"6894ce39f8236939021026fa4357032322007a3ce0dfceb7ec597fbcde59e022"}
Apr 17 21:16:15.416653 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.416620 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-szdt5" event={"ID":"d51a735f-f9c9-457e-9805-9c1c8b0e30cc","Type":"ContainerStarted","Data":"b43412d6084d871fcf0e5091b1b4608b551a4ba8bff6c53b6cfc52a5f6d0deac"}
Apr 17 21:16:15.416653 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.416654 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-szdt5" event={"ID":"d51a735f-f9c9-457e-9805-9c1c8b0e30cc","Type":"ContainerStarted","Data":"250a432da390639191796ca944dc7cadd08cbb1f328b7e71e57bbed19ebe74b1"}
Apr 17 21:16:15.418274 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.418253 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf" event={"ID":"977c9f1c-37d4-44d9-b178-b63a50b4a640","Type":"ContainerStarted","Data":"9cc0617bb324bce9f9fbcacdd4941b0c74fdf13cb386e130c812ea710128ca18"}
Apr 17 21:16:15.434249 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.434199 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-szdt5" podStartSLOduration=2.641402175 podStartE2EDuration="3.43416018s" podCreationTimestamp="2026-04-17 21:16:12 +0000 UTC" firstStartedPulling="2026-04-17 21:16:13.238306782 +0000 UTC m=+176.934861838" lastFinishedPulling="2026-04-17 21:16:14.031064792 +0000 UTC m=+177.727619843" observedRunningTime="2026-04-17 21:16:15.433067739 +0000 UTC m=+179.129622810" watchObservedRunningTime="2026-04-17 21:16:15.43416018 +0000 UTC m=+179.130715235"
Apr 17 21:16:15.448980 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.448937 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8l9gf" podStartSLOduration=2.099803613 podStartE2EDuration="3.448925278s" podCreationTimestamp="2026-04-17 21:16:12 +0000 UTC" firstStartedPulling="2026-04-17 21:16:13.469943386 +0000 UTC m=+177.166498438" lastFinishedPulling="2026-04-17 21:16:14.81906505 +0000 UTC m=+178.515620103" observedRunningTime="2026-04-17 21:16:15.448050978 +0000 UTC m=+179.144606051" watchObservedRunningTime="2026-04-17 21:16:15.448925278 +0000 UTC m=+179.145480394"
Apr 17 21:16:15.975361 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.975324 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"]
Apr 17 21:16:15.978885 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.978860 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"
Apr 17 21:16:15.981275 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.981238 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-22j8l\""
Apr 17 21:16:15.981490 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.981468 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 21:16:15.981584 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.981516 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-bi6di03a4betd\""
Apr 17 21:16:15.981584 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.981525 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 21:16:15.981686 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.981586 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 21:16:15.981686 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.981526 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 21:16:15.981767 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.981721 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 21:16:15.987760 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:15.987737 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"]
Apr 17 21:16:16.004330 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.004308 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"
Apr 17 21:16:16.004330 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.004337 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qg4l\" (UniqueName: \"kubernetes.io/projected/08cc730b-19b5-47e6-9b01-174bc4e3cc13-kube-api-access-2qg4l\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"
Apr 17 21:16:16.004450 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.004359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-grpc-tls\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"
Apr 17 21:16:16.004450 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.004399 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"
Apr 17 21:16:16.004450 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.004426 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"
Apr 17 21:16:16.004558 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.004477 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-tls\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"
Apr 17 21:16:16.004558 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.004493 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08cc730b-19b5-47e6-9b01-174bc4e3cc13-metrics-client-ca\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"
Apr 17 21:16:16.004558 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.004511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"
Apr 17 21:16:16.105081 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.105054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"
Apr 17 21:16:16.105204 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.105091 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qg4l\" (UniqueName: \"kubernetes.io/projected/08cc730b-19b5-47e6-9b01-174bc4e3cc13-kube-api-access-2qg4l\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"
Apr 17 21:16:16.105252 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.105208 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-grpc-tls\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"
Apr 17 21:16:16.105303 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.105250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.105303 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.105289 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.105422 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.105350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-tls\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.105422 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.105375 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08cc730b-19b5-47e6-9b01-174bc4e3cc13-metrics-client-ca\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.105422 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.105407 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 
21:16:16.106126 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.106104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08cc730b-19b5-47e6-9b01-174bc4e3cc13-metrics-client-ca\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.107998 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.107971 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.108149 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.108127 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-tls\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.108596 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.108573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.108670 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.108603 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.108670 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.108622 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.108751 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.108690 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/08cc730b-19b5-47e6-9b01-174bc4e3cc13-secret-grpc-tls\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.111976 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.111952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qg4l\" (UniqueName: \"kubernetes.io/projected/08cc730b-19b5-47e6-9b01-174bc4e3cc13-kube-api-access-2qg4l\") pod \"thanos-querier-5dfb58dddc-9ttxq\" (UID: \"08cc730b-19b5-47e6-9b01-174bc4e3cc13\") " pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.289358 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.289273 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:16.416740 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:16.416710 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq"] Apr 17 21:16:16.420094 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:16:16.420067 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08cc730b_19b5_47e6_9b01_174bc4e3cc13.slice/crio-766d37fc7949f0163897c43b5605cc42f9211e60ea06fcc6a40369c63b40556e WatchSource:0}: Error finding container 766d37fc7949f0163897c43b5605cc42f9211e60ea06fcc6a40369c63b40556e: Status 404 returned error can't find the container with id 766d37fc7949f0163897c43b5605cc42f9211e60ea06fcc6a40369c63b40556e Apr 17 21:16:17.425077 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:17.425036 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" event={"ID":"08cc730b-19b5-47e6-9b01-174bc4e3cc13","Type":"ContainerStarted","Data":"766d37fc7949f0163897c43b5605cc42f9211e60ea06fcc6a40369c63b40556e"} Apr 17 21:16:17.669865 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:17.669830 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz"] Apr 17 21:16:17.673093 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:17.673071 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz" Apr 17 21:16:17.675358 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:17.675310 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-8bsn6\"" Apr 17 21:16:17.675358 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:17.675322 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 21:16:17.680450 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:17.680031 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz"] Apr 17 21:16:17.718243 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:17.718199 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/621a2863-afe1-4038-a82c-a6786ab54ffc-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kcrhz\" (UID: \"621a2863-afe1-4038-a82c-a6786ab54ffc\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz" Apr 17 21:16:17.818871 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:17.818838 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/621a2863-afe1-4038-a82c-a6786ab54ffc-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kcrhz\" (UID: \"621a2863-afe1-4038-a82c-a6786ab54ffc\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz" Apr 17 21:16:17.821314 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:17.821288 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/621a2863-afe1-4038-a82c-a6786ab54ffc-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kcrhz\" (UID: \"621a2863-afe1-4038-a82c-a6786ab54ffc\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz" Apr 17 21:16:17.984461 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:17.984424 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz" Apr 17 21:16:18.561090 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:18.561058 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz"] Apr 17 21:16:18.564279 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:16:18.564245 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod621a2863_afe1_4038_a82c_a6786ab54ffc.slice/crio-e606ea99a86b891b891a49e97f64d51974777bb62166528ae1cfbd68a402721c WatchSource:0}: Error finding container e606ea99a86b891b891a49e97f64d51974777bb62166528ae1cfbd68a402721c: Status 404 returned error can't find the container with id e606ea99a86b891b891a49e97f64d51974777bb62166528ae1cfbd68a402721c Apr 17 21:16:19.107201 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.105007 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 21:16:19.109465 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.109430 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.112022 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.111997 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ba4j854sbn2qo\"" Apr 17 21:16:19.112275 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.112247 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-66qpt\"" Apr 17 21:16:19.112336 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.112298 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 21:16:19.112382 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.112258 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 21:16:19.112811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.112791 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 21:16:19.115377 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.113230 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 21:16:19.115377 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.114822 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 21:16:19.115377 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.113297 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 21:16:19.115377 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.113388 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 21:16:19.115377 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.112898 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 21:16:19.115377 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.113713 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 21:16:19.116510 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.115793 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 21:16:19.116510 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.115993 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 21:16:19.118381 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.118359 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 21:16:19.125795 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.125735 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 21:16:19.127687 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.127648 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 21:16:19.132307 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.132283 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
21:16:19.132490 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.132474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.132620 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.132607 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-web-config\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.132722 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.132709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.132813 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.132798 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99268d51-f6a1-4015-8ab9-033030df0e34-config-out\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.132916 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.132903 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.133012 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.132999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.133101 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.133088 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.133210 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.133195 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.133313 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.133300 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 17 21:16:19.133393 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.133382 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.133470 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.133459 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-config\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.133560 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.133549 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2tsr\" (UniqueName: \"kubernetes.io/projected/99268d51-f6a1-4015-8ab9-033030df0e34-kube-api-access-r2tsr\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.133673 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.133660 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99268d51-f6a1-4015-8ab9-033030df0e34-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.133758 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.133745 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-kubelet-serving-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.133889 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.133876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.133982 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.133970 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.134075 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.134064 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.234835 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.234883 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.234915 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-web-config\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.234951 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.234983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99268d51-f6a1-4015-8ab9-033030df0e34-config-out\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235049 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235124 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-config\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2tsr\" (UniqueName: \"kubernetes.io/projected/99268d51-f6a1-4015-8ab9-033030df0e34-kube-api-access-r2tsr\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99268d51-f6a1-4015-8ab9-033030df0e34-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235323 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235391 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.238309 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.239268 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.235448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.239268 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.236976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.239268 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.238194 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.239268 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.238480 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.240363 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.240334 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.241231 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.241206 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.242292 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.241919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.243859 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.243829 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.243859 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.243849 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-config\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.244012 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.243876 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.244284 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.244256 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-web-config\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.245517 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.245493 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.245825 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.245802 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99268d51-f6a1-4015-8ab9-033030df0e34-config-out\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.246895 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.245999 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.246895 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.246336 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.247376 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.247005 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99268d51-f6a1-4015-8ab9-033030df0e34-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.247376 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.247052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.247517 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.247487 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.248640 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.248617 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r2tsr\" (UniqueName: \"kubernetes.io/projected/99268d51-f6a1-4015-8ab9-033030df0e34-kube-api-access-r2tsr\") pod \"prometheus-k8s-0\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.427463 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.427387 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:19.438695 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.438578 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" event={"ID":"08cc730b-19b5-47e6-9b01-174bc4e3cc13","Type":"ContainerStarted","Data":"bcb3ce0f9005adae99f349c7045f4bc5e84dc45f16baa2b8634432986b9ae977"} Apr 17 21:16:19.438695 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.438644 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" event={"ID":"08cc730b-19b5-47e6-9b01-174bc4e3cc13","Type":"ContainerStarted","Data":"025f97da3067be5aa762603bc27e4fe51648bbabae23435f9868688ba366119f"} Apr 17 21:16:19.438695 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.438660 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" event={"ID":"08cc730b-19b5-47e6-9b01-174bc4e3cc13","Type":"ContainerStarted","Data":"740c5d0d9566a572318c883a9ce5c88ff2c831cabd59c31a359ea08c29060f84"} Apr 17 21:16:19.439733 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.439699 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz" event={"ID":"621a2863-afe1-4038-a82c-a6786ab54ffc","Type":"ContainerStarted","Data":"e606ea99a86b891b891a49e97f64d51974777bb62166528ae1cfbd68a402721c"} Apr 17 21:16:19.611045 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:19.611021 2572 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 21:16:19.613996 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:16:19.613971 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99268d51_f6a1_4015_8ab9_033030df0e34.slice/crio-3e9d9eeacce687c4c9ea1284391994c40a90aceb4469a92190f2d678a53f471d WatchSource:0}: Error finding container 3e9d9eeacce687c4c9ea1284391994c40a90aceb4469a92190f2d678a53f471d: Status 404 returned error can't find the container with id 3e9d9eeacce687c4c9ea1284391994c40a90aceb4469a92190f2d678a53f471d Apr 17 21:16:20.445414 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:20.445324 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" event={"ID":"08cc730b-19b5-47e6-9b01-174bc4e3cc13","Type":"ContainerStarted","Data":"ceb22b1635fa8d43c6473df37c2dec7b09ece85471bc9fbba4fb32effd567701"} Apr 17 21:16:20.445414 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:20.445372 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" event={"ID":"08cc730b-19b5-47e6-9b01-174bc4e3cc13","Type":"ContainerStarted","Data":"f3793da4b730a56984b1ba0c57f7f5e27baa870208324141194e295e18a44433"} Apr 17 21:16:20.445414 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:20.445387 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" event={"ID":"08cc730b-19b5-47e6-9b01-174bc4e3cc13","Type":"ContainerStarted","Data":"1e9f59996ddc260e8c716868cb3e5558939330904177241a8c46133a71ac73c8"} Apr 17 21:16:20.445703 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:20.445562 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:20.446598 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:20.446567 2572 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerStarted","Data":"3e9d9eeacce687c4c9ea1284391994c40a90aceb4469a92190f2d678a53f471d"} Apr 17 21:16:20.447888 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:20.447865 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz" event={"ID":"621a2863-afe1-4038-a82c-a6786ab54ffc","Type":"ContainerStarted","Data":"b20ea4877f634e75149c985b679712bcad7a319906682f31f8178487c866d2c6"} Apr 17 21:16:20.448085 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:20.448065 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz" Apr 17 21:16:20.454590 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:20.454567 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz" Apr 17 21:16:20.471046 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:20.470993 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" podStartSLOduration=2.423634838 podStartE2EDuration="5.470976619s" podCreationTimestamp="2026-04-17 21:16:15 +0000 UTC" firstStartedPulling="2026-04-17 21:16:16.422285944 +0000 UTC m=+180.118840994" lastFinishedPulling="2026-04-17 21:16:19.469627721 +0000 UTC m=+183.166182775" observedRunningTime="2026-04-17 21:16:20.468784603 +0000 UTC m=+184.165339676" watchObservedRunningTime="2026-04-17 21:16:20.470976619 +0000 UTC m=+184.167531694" Apr 17 21:16:20.484974 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:20.484885 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kcrhz" podStartSLOduration=1.951253961 podStartE2EDuration="3.484872471s" podCreationTimestamp="2026-04-17 
21:16:17 +0000 UTC" firstStartedPulling="2026-04-17 21:16:18.566282913 +0000 UTC m=+182.262837967" lastFinishedPulling="2026-04-17 21:16:20.099901427 +0000 UTC m=+183.796456477" observedRunningTime="2026-04-17 21:16:20.483712035 +0000 UTC m=+184.180267110" watchObservedRunningTime="2026-04-17 21:16:20.484872471 +0000 UTC m=+184.181427542" Apr 17 21:16:21.451849 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:21.451811 2572 generic.go:358] "Generic (PLEG): container finished" podID="99268d51-f6a1-4015-8ab9-033030df0e34" containerID="d131c370f5a5e0e7553438c3c8ecf19320e3fb21847876bcd93a3bc6cdb849fd" exitCode=0 Apr 17 21:16:21.452303 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:21.451874 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerDied","Data":"d131c370f5a5e0e7553438c3c8ecf19320e3fb21847876bcd93a3bc6cdb849fd"} Apr 17 21:16:24.463910 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:24.463882 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerStarted","Data":"082c7607308a2bcdfa1ba79060457e420620f3e5e14a3328bd3ce224255825af"} Apr 17 21:16:25.469841 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:25.469809 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerStarted","Data":"5a8e300b2a202d29d225ea518ac0a89afaff3d777834317608117c23b8649203"} Apr 17 21:16:25.469841 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:25.469843 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerStarted","Data":"e63badcf6fc2e4f421694c2873937f1ababc48e5184a1a93fc30bcb816cb7bab"} Apr 17 21:16:25.470292 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:16:25.469853 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerStarted","Data":"cc0835e23020774ea4dc920a8cf74a79654e2cd044955a6c5516f277bd626efd"} Apr 17 21:16:25.470292 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:25.469863 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerStarted","Data":"4f174b5d87d25669e662087d8a809574c1890531bc979bdf48f6c8a125f9ad85"} Apr 17 21:16:25.470292 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:25.469873 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerStarted","Data":"7e2f0797bf839bf77758f87632af0479834b639623ab9d9b5dcfff1b799b389f"} Apr 17 21:16:25.496968 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:25.496916 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.731752693 podStartE2EDuration="6.496901079s" podCreationTimestamp="2026-04-17 21:16:19 +0000 UTC" firstStartedPulling="2026-04-17 21:16:19.61583958 +0000 UTC m=+183.312394631" lastFinishedPulling="2026-04-17 21:16:24.380987964 +0000 UTC m=+188.077543017" observedRunningTime="2026-04-17 21:16:25.494325899 +0000 UTC m=+189.190880971" watchObservedRunningTime="2026-04-17 21:16:25.496901079 +0000 UTC m=+189.193456152" Apr 17 21:16:26.458052 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:26.458023 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5dfb58dddc-9ttxq" Apr 17 21:16:29.428242 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:29.428202 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:16:38.508660 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:38.508621 2572 generic.go:358] "Generic (PLEG): container finished" podID="6bd70117-3252-4f39-aaf3-d2d5efefcc3b" containerID="45d2a3728fab1e0df9c97def2480c8f4954325fba8f0d5c502a5be0f96114e4b" exitCode=0 Apr 17 21:16:38.509212 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:38.508700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d2h87" event={"ID":"6bd70117-3252-4f39-aaf3-d2d5efefcc3b","Type":"ContainerDied","Data":"45d2a3728fab1e0df9c97def2480c8f4954325fba8f0d5c502a5be0f96114e4b"} Apr 17 21:16:38.509212 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:38.509152 2572 scope.go:117] "RemoveContainer" containerID="45d2a3728fab1e0df9c97def2480c8f4954325fba8f0d5c502a5be0f96114e4b" Apr 17 21:16:38.987042 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:38.987006 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7c95fdb69b-rgqgn_a7386c80-9465-42c2-b807-c545f0e73d34/router/0.log" Apr 17 21:16:38.992275 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:38.992250 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dfvbk_03a3935c-b9b0-4fda-8142-8cd63a6b5a88/serve-healthcheck-canary/0.log" Apr 17 21:16:39.513568 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:39.513538 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d2h87" event={"ID":"6bd70117-3252-4f39-aaf3-d2d5efefcc3b","Type":"ContainerStarted","Data":"7e01c5ee90deeec6922fad62d148b12f66d8c722e0a3923ef7139a00d2cf8ebb"} Apr 17 21:16:59.574814 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:59.574779 2572 generic.go:358] "Generic (PLEG): container finished" podID="b2eabc41-fed0-438c-adb4-7eda4a023735" containerID="977ce6f91fbc8e4930ef7fb9f60785ba4aec92ef037f063c9eb92b60b267f175" exitCode=0 Apr 
17 21:16:59.575296 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:59.574859 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp" event={"ID":"b2eabc41-fed0-438c-adb4-7eda4a023735","Type":"ContainerDied","Data":"977ce6f91fbc8e4930ef7fb9f60785ba4aec92ef037f063c9eb92b60b267f175"} Apr 17 21:16:59.575296 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:16:59.575227 2572 scope.go:117] "RemoveContainer" containerID="977ce6f91fbc8e4930ef7fb9f60785ba4aec92ef037f063c9eb92b60b267f175" Apr 17 21:17:00.579071 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:00.579039 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c54gp" event={"ID":"b2eabc41-fed0-438c-adb4-7eda4a023735","Type":"ContainerStarted","Data":"ecba41b9edc996a12670b7cd5e85ae13b53acc14c171f3d31328be2032e31ea5"} Apr 17 21:17:06.598607 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:06.598522 2572 generic.go:358] "Generic (PLEG): container finished" podID="bfbb2b11-b5bf-4910-b508-d65f63da7218" containerID="16f7d83fdb88b8ad235185059a79d4afbc678781dd48835a0c7ec91761fc6d88" exitCode=0 Apr 17 21:17:06.599000 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:06.598597 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm" event={"ID":"bfbb2b11-b5bf-4910-b508-d65f63da7218","Type":"ContainerDied","Data":"16f7d83fdb88b8ad235185059a79d4afbc678781dd48835a0c7ec91761fc6d88"} Apr 17 21:17:06.599000 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:06.598919 2572 scope.go:117] "RemoveContainer" containerID="16f7d83fdb88b8ad235185059a79d4afbc678781dd48835a0c7ec91761fc6d88" Apr 17 21:17:07.603587 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:07.603554 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xtmbm" event={"ID":"bfbb2b11-b5bf-4910-b508-d65f63da7218","Type":"ContainerStarted","Data":"37786b9e42d0a960a87ed3b6cf9f64d3d4c17fdea625a42f962efa10b59df332"} Apr 17 21:17:19.428470 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:19.428413 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:19.444538 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:19.444507 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:19.653264 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:19.653233 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:27.528303 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:27.528244 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:17:27.530599 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:27.530576 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4574ebc4-f71f-480d-9e07-5e822f12bb1a-metrics-certs\") pod \"network-metrics-daemon-nprtf\" (UID: \"4574ebc4-f71f-480d-9e07-5e822f12bb1a\") " pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:17:27.667308 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:27.667273 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zzp4n\"" Apr 17 21:17:27.675083 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:27.675056 2572 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nprtf" Apr 17 21:17:27.796727 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:27.796643 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nprtf"] Apr 17 21:17:27.799697 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:17:27.799665 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4574ebc4_f71f_480d_9e07_5e822f12bb1a.slice/crio-32035f41d572befc2f3b8e19b16091995ece5123e8ab838f18ebd80b72d4cd89 WatchSource:0}: Error finding container 32035f41d572befc2f3b8e19b16091995ece5123e8ab838f18ebd80b72d4cd89: Status 404 returned error can't find the container with id 32035f41d572befc2f3b8e19b16091995ece5123e8ab838f18ebd80b72d4cd89 Apr 17 21:17:28.663346 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:28.663302 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nprtf" event={"ID":"4574ebc4-f71f-480d-9e07-5e822f12bb1a","Type":"ContainerStarted","Data":"32035f41d572befc2f3b8e19b16091995ece5123e8ab838f18ebd80b72d4cd89"} Apr 17 21:17:29.668307 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:29.668269 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nprtf" event={"ID":"4574ebc4-f71f-480d-9e07-5e822f12bb1a","Type":"ContainerStarted","Data":"6d5c75ecc0833dcd975ab5070357822ace009a1856cd38898722f2da18e1066b"} Apr 17 21:17:29.668735 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:29.668314 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nprtf" event={"ID":"4574ebc4-f71f-480d-9e07-5e822f12bb1a","Type":"ContainerStarted","Data":"9e5c3dc31511ad60d288e74a27d42fc07cf1e1b0edbc0b754455d032ecadfa5e"} Apr 17 21:17:29.682486 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:29.682434 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nprtf" podStartSLOduration=252.723137745 podStartE2EDuration="4m13.682413877s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="2026-04-17 21:17:27.801683306 +0000 UTC m=+251.498238360" lastFinishedPulling="2026-04-17 21:17:28.760959438 +0000 UTC m=+252.457514492" observedRunningTime="2026-04-17 21:17:29.681805569 +0000 UTC m=+253.378360641" watchObservedRunningTime="2026-04-17 21:17:29.682413877 +0000 UTC m=+253.378968950" Apr 17 21:17:37.467829 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.467750 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 21:17:37.468347 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.468246 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="prometheus" containerID="cri-o://082c7607308a2bcdfa1ba79060457e420620f3e5e14a3328bd3ce224255825af" gracePeriod=600 Apr 17 21:17:37.468427 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.468314 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="config-reloader" containerID="cri-o://7e2f0797bf839bf77758f87632af0479834b639623ab9d9b5dcfff1b799b389f" gracePeriod=600 Apr 17 21:17:37.468427 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.468317 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="thanos-sidecar" containerID="cri-o://4f174b5d87d25669e662087d8a809574c1890531bc979bdf48f6c8a125f9ad85" gracePeriod=600 Apr 17 21:17:37.468427 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.468371 2572 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="kube-rbac-proxy-web" containerID="cri-o://cc0835e23020774ea4dc920a8cf74a79654e2cd044955a6c5516f277bd626efd" gracePeriod=600 Apr 17 21:17:37.468427 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.468318 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="kube-rbac-proxy-thanos" containerID="cri-o://5a8e300b2a202d29d225ea518ac0a89afaff3d777834317608117c23b8649203" gracePeriod=600 Apr 17 21:17:37.468624 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.468522 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="kube-rbac-proxy" containerID="cri-o://e63badcf6fc2e4f421694c2873937f1ababc48e5184a1a93fc30bcb816cb7bab" gracePeriod=600 Apr 17 21:17:37.707093 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.707058 2572 generic.go:358] "Generic (PLEG): container finished" podID="99268d51-f6a1-4015-8ab9-033030df0e34" containerID="5a8e300b2a202d29d225ea518ac0a89afaff3d777834317608117c23b8649203" exitCode=0 Apr 17 21:17:37.707093 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.707084 2572 generic.go:358] "Generic (PLEG): container finished" podID="99268d51-f6a1-4015-8ab9-033030df0e34" containerID="e63badcf6fc2e4f421694c2873937f1ababc48e5184a1a93fc30bcb816cb7bab" exitCode=0 Apr 17 21:17:37.707093 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.707090 2572 generic.go:358] "Generic (PLEG): container finished" podID="99268d51-f6a1-4015-8ab9-033030df0e34" containerID="cc0835e23020774ea4dc920a8cf74a79654e2cd044955a6c5516f277bd626efd" exitCode=0 Apr 17 21:17:37.707093 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.707096 2572 generic.go:358] 
"Generic (PLEG): container finished" podID="99268d51-f6a1-4015-8ab9-033030df0e34" containerID="4f174b5d87d25669e662087d8a809574c1890531bc979bdf48f6c8a125f9ad85" exitCode=0 Apr 17 21:17:37.707093 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.707101 2572 generic.go:358] "Generic (PLEG): container finished" podID="99268d51-f6a1-4015-8ab9-033030df0e34" containerID="7e2f0797bf839bf77758f87632af0479834b639623ab9d9b5dcfff1b799b389f" exitCode=0 Apr 17 21:17:37.707093 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.707106 2572 generic.go:358] "Generic (PLEG): container finished" podID="99268d51-f6a1-4015-8ab9-033030df0e34" containerID="082c7607308a2bcdfa1ba79060457e420620f3e5e14a3328bd3ce224255825af" exitCode=0 Apr 17 21:17:37.707467 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.707156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerDied","Data":"5a8e300b2a202d29d225ea518ac0a89afaff3d777834317608117c23b8649203"} Apr 17 21:17:37.707467 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.707228 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerDied","Data":"e63badcf6fc2e4f421694c2873937f1ababc48e5184a1a93fc30bcb816cb7bab"} Apr 17 21:17:37.707467 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.707243 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerDied","Data":"cc0835e23020774ea4dc920a8cf74a79654e2cd044955a6c5516f277bd626efd"} Apr 17 21:17:37.707467 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.707257 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerDied","Data":"4f174b5d87d25669e662087d8a809574c1890531bc979bdf48f6c8a125f9ad85"} Apr 17 21:17:37.707467 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.707270 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerDied","Data":"7e2f0797bf839bf77758f87632af0479834b639623ab9d9b5dcfff1b799b389f"} Apr 17 21:17:37.707467 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.707282 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerDied","Data":"082c7607308a2bcdfa1ba79060457e420620f3e5e14a3328bd3ce224255825af"} Apr 17 21:17:37.721575 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.721551 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:37.816932 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.816892 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-grpc-tls\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817124 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.816942 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-thanos-prometheus-http-client-file\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817124 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.816962 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" 
(UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-metrics-client-certs\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817124 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.816990 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-kubelet-serving-ca-bundle\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817124 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817025 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-tls\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817124 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817047 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-k8s-db\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817124 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817091 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2tsr\" (UniqueName: \"kubernetes.io/projected/99268d51-f6a1-4015-8ab9-033030df0e34-kube-api-access-r2tsr\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817124 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817116 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817143 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-metrics-client-ca\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817212 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-config\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817253 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-trusted-ca-bundle\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817278 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99268d51-f6a1-4015-8ab9-033030df0e34-tls-assets\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817311 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-k8s-rulefiles-0\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817337 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99268d51-f6a1-4015-8ab9-033030df0e34-config-out\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817369 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-web-config\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817392 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817433 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-serving-certs-ca-bundle\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817479 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-kube-rbac-proxy\") pod \"99268d51-f6a1-4015-8ab9-033030df0e34\" (UID: \"99268d51-f6a1-4015-8ab9-033030df0e34\") " Apr 17 21:17:37.817514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817478 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:17:37.818054 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.817728 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.818236 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.818201 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:17:37.818900 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.818866 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:17:37.819477 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.819451 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:17:37.820151 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.819836 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:17:37.820311 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.820280 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:37.820706 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.820681 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). 
InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:17:37.820787 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.820726 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99268d51-f6a1-4015-8ab9-033030df0e34-kube-api-access-r2tsr" (OuterVolumeSpecName: "kube-api-access-r2tsr") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "kube-api-access-r2tsr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:17:37.820787 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.820758 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:37.820946 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.820854 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:37.820946 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.820903 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:37.821106 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.820943 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:37.821404 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.821378 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-config" (OuterVolumeSpecName: "config") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:37.821737 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.821712 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:37.822365 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.822329 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99268d51-f6a1-4015-8ab9-033030df0e34-config-out" (OuterVolumeSpecName: "config-out") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:17:37.822770 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.822748 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99268d51-f6a1-4015-8ab9-033030df0e34-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:17:37.823084 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.823055 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:37.833226 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.833186 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-web-config" (OuterVolumeSpecName: "web-config") pod "99268d51-f6a1-4015-8ab9-033030df0e34" (UID: "99268d51-f6a1-4015-8ab9-033030df0e34"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:37.919161 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919111 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-tls\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919161 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919153 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-k8s-db\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919161 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919191 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r2tsr\" (UniqueName: \"kubernetes.io/projected/99268d51-f6a1-4015-8ab9-033030df0e34-kube-api-access-r2tsr\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919208 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919222 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-metrics-client-ca\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919238 2572 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-config\") on node 
\"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919251 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-trusted-ca-bundle\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919264 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99268d51-f6a1-4015-8ab9-033030df0e34-tls-assets\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919276 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919287 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99268d51-f6a1-4015-8ab9-033030df0e34-config-out\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919299 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-web-config\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919311 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 
17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919323 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99268d51-f6a1-4015-8ab9-033030df0e34-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919336 2572 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-kube-rbac-proxy\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919348 2572 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-grpc-tls\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919360 2572 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-thanos-prometheus-http-client-file\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:37.919456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:37.919372 2572 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/99268d51-f6a1-4015-8ab9-033030df0e34-secret-metrics-client-certs\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:17:38.712548 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.712513 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"99268d51-f6a1-4015-8ab9-033030df0e34","Type":"ContainerDied","Data":"3e9d9eeacce687c4c9ea1284391994c40a90aceb4469a92190f2d678a53f471d"} Apr 
17 21:17:38.712548 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.712549 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.712984 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.712574 2572 scope.go:117] "RemoveContainer" containerID="5a8e300b2a202d29d225ea518ac0a89afaff3d777834317608117c23b8649203" Apr 17 21:17:38.720237 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.720217 2572 scope.go:117] "RemoveContainer" containerID="e63badcf6fc2e4f421694c2873937f1ababc48e5184a1a93fc30bcb816cb7bab" Apr 17 21:17:38.727328 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.727312 2572 scope.go:117] "RemoveContainer" containerID="cc0835e23020774ea4dc920a8cf74a79654e2cd044955a6c5516f277bd626efd" Apr 17 21:17:38.734021 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.733994 2572 scope.go:117] "RemoveContainer" containerID="4f174b5d87d25669e662087d8a809574c1890531bc979bdf48f6c8a125f9ad85" Apr 17 21:17:38.735066 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.734961 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 21:17:38.740519 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.740490 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 21:17:38.743687 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.743670 2572 scope.go:117] "RemoveContainer" containerID="7e2f0797bf839bf77758f87632af0479834b639623ab9d9b5dcfff1b799b389f" Apr 17 21:17:38.750695 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.750677 2572 scope.go:117] "RemoveContainer" containerID="082c7607308a2bcdfa1ba79060457e420620f3e5e14a3328bd3ce224255825af" Apr 17 21:17:38.757605 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.757581 2572 scope.go:117] "RemoveContainer" containerID="d131c370f5a5e0e7553438c3c8ecf19320e3fb21847876bcd93a3bc6cdb849fd" Apr 17 21:17:38.760515 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:17:38.760493 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 21:17:38.760819 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760806 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="init-config-reloader" Apr 17 21:17:38.760819 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760819 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="init-config-reloader" Apr 17 21:17:38.760939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760833 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="kube-rbac-proxy" Apr 17 21:17:38.760939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760840 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="kube-rbac-proxy" Apr 17 21:17:38.760939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760847 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="kube-rbac-proxy-web" Apr 17 21:17:38.760939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760853 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="kube-rbac-proxy-web" Apr 17 21:17:38.760939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760862 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="thanos-sidecar" Apr 17 21:17:38.760939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760868 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="thanos-sidecar" Apr 17 21:17:38.760939 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:17:38.760873 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="kube-rbac-proxy-thanos" Apr 17 21:17:38.760939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760878 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="kube-rbac-proxy-thanos" Apr 17 21:17:38.760939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760886 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="prometheus" Apr 17 21:17:38.760939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760891 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="prometheus" Apr 17 21:17:38.760939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760900 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="config-reloader" Apr 17 21:17:38.760939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760907 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="config-reloader" Apr 17 21:17:38.761397 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760951 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="kube-rbac-proxy" Apr 17 21:17:38.761397 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760960 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="thanos-sidecar" Apr 17 21:17:38.761397 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760967 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="prometheus" Apr 17 21:17:38.761397 
ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760976 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="config-reloader" Apr 17 21:17:38.761397 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760981 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="kube-rbac-proxy-web" Apr 17 21:17:38.761397 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.760988 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" containerName="kube-rbac-proxy-thanos" Apr 17 21:17:38.766280 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.766257 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.768549 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.768516 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 21:17:38.768682 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.768556 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 21:17:38.768682 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.768519 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 21:17:38.768682 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.768519 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 21:17:38.768972 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.768876 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 21:17:38.768972 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:17:38.768886 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 21:17:38.769072 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.769052 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 21:17:38.769130 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.769111 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 21:17:38.769615 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.769586 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 21:17:38.769957 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.769940 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 21:17:38.770084 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.770065 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-66qpt\"" Apr 17 21:17:38.770152 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.770081 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ba4j854sbn2qo\"" Apr 17 21:17:38.770716 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.770355 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 21:17:38.772024 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.772006 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 21:17:38.774879 ip-10-0-135-174 kubenswrapper[2572]: 
I0417 21:17:38.774858 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 21:17:38.776615 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.776593 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 21:17:38.867850 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.867816 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99268d51-f6a1-4015-8ab9-033030df0e34" path="/var/lib/kubelet/pods/99268d51-f6a1-4015-8ab9-033030df0e34/volumes" Apr 17 21:17:38.928762 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.928727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-web-config\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.928762 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.928764 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-config\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.928976 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.928792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxsts\" (UniqueName: \"kubernetes.io/projected/6291479f-aae0-45f7-b5db-f1870619ef39-kube-api-access-qxsts\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.928976 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.928857 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6291479f-aae0-45f7-b5db-f1870619ef39-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.928976 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.928910 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.928976 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.928934 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6291479f-aae0-45f7-b5db-f1870619ef39-config-out\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.928976 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.928955 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.929159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.928995 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.929159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.929016 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.929159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.929045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.929159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.929076 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.929159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.929090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.929159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.929115 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.929159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.929133 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.929159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.929158 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.929492 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.929204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.929492 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.929233 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:38.929492 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:38.929249 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6291479f-aae0-45f7-b5db-f1870619ef39-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030409 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030409 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030354 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030409 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030409 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030396 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030749 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030749 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030749 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030715 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030749 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6291479f-aae0-45f7-b5db-f1870619ef39-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030959 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030790 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-web-config\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030959 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-config\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030959 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxsts\" (UniqueName: \"kubernetes.io/projected/6291479f-aae0-45f7-b5db-f1870619ef39-kube-api-access-qxsts\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030959 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030875 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6291479f-aae0-45f7-b5db-f1870619ef39-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.030959 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030928 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.031240 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.030964 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6291479f-aae0-45f7-b5db-f1870619ef39-config-out\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.031240 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.031010 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.031240 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.031041 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.031944 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.031616 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.031944 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.031686 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6291479f-aae0-45f7-b5db-f1870619ef39-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.032392 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:17:39.032357 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.032482 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.032412 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.033288 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.032704 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.033288 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.032959 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.033288 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.033133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.033824 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.033779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.033907 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.033881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.034495 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.034451 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.035095 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.035046 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.035275 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.035251 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.035386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.035366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-web-config\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.035657 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.035633 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6291479f-aae0-45f7-b5db-f1870619ef39-config-out\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.035920 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.035897 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.036002 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.035976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6291479f-aae0-45f7-b5db-f1870619ef39-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.036251 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.036235 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-config\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.036397 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.036381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6291479f-aae0-45f7-b5db-f1870619ef39-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.037543 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.037524 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6291479f-aae0-45f7-b5db-f1870619ef39-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.038486 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.038468 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxsts\" (UniqueName: \"kubernetes.io/projected/6291479f-aae0-45f7-b5db-f1870619ef39-kube-api-access-qxsts\") pod \"prometheus-k8s-0\" (UID: \"6291479f-aae0-45f7-b5db-f1870619ef39\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.077088 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.077053 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:17:39.217499 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.217437 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 21:17:39.220280 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:17:39.220241 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6291479f_aae0_45f7_b5db_f1870619ef39.slice/crio-8106acbeceb35498a6455319e3a405d6c314a9be48ab9ed3e619f70a97215f85 WatchSource:0}: Error finding container 8106acbeceb35498a6455319e3a405d6c314a9be48ab9ed3e619f70a97215f85: Status 404 returned error can't find the container with id 8106acbeceb35498a6455319e3a405d6c314a9be48ab9ed3e619f70a97215f85 Apr 17 21:17:39.717633 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.717546 2572 generic.go:358] "Generic (PLEG): container finished" podID="6291479f-aae0-45f7-b5db-f1870619ef39" containerID="05d4af5dcb98fc07ee962d6399b2d2db6cc6acc0c14c2a17f7548a7d60b63792" exitCode=0 Apr 17 21:17:39.718020 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.717638 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6291479f-aae0-45f7-b5db-f1870619ef39","Type":"ContainerDied","Data":"05d4af5dcb98fc07ee962d6399b2d2db6cc6acc0c14c2a17f7548a7d60b63792"} Apr 17 21:17:39.718020 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:39.717673 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6291479f-aae0-45f7-b5db-f1870619ef39","Type":"ContainerStarted","Data":"8106acbeceb35498a6455319e3a405d6c314a9be48ab9ed3e619f70a97215f85"} Apr 17 21:17:40.723934 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:40.723891 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"6291479f-aae0-45f7-b5db-f1870619ef39","Type":"ContainerStarted","Data":"73ae807c25a1d220a6bf2243b0bb5e48711d140ea6dbdef41392f0f7747a1e13"} Apr 17 21:17:40.723934 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:40.723931 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6291479f-aae0-45f7-b5db-f1870619ef39","Type":"ContainerStarted","Data":"18f66be828d5163b60e1efd0c90e6fbd3a23dcacd20870c49b1eaf704375c298"} Apr 17 21:17:40.723934 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:40.723942 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6291479f-aae0-45f7-b5db-f1870619ef39","Type":"ContainerStarted","Data":"d264877b172d923f3302b71874453ed7fe73e6b2ffdef46feca903b68aeda8ee"} Apr 17 21:17:40.724552 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:40.723950 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6291479f-aae0-45f7-b5db-f1870619ef39","Type":"ContainerStarted","Data":"b8e990ceebb868feb788be8d8d0b92ac46560ddf4d313e6f225ccb971cdc008a"} Apr 17 21:17:40.724552 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:40.723959 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6291479f-aae0-45f7-b5db-f1870619ef39","Type":"ContainerStarted","Data":"10be3e46fff12c70d1fed05d01ecf78304cc447a678bbf699d746460dfbf0f75"} Apr 17 21:17:40.724552 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:40.723968 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6291479f-aae0-45f7-b5db-f1870619ef39","Type":"ContainerStarted","Data":"d7a7a4f0734d68b41ceb576c905d18ffc57aac9044005b7b1a310c55c6e232e6"} Apr 17 21:17:40.750367 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:40.750308 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.750287697 podStartE2EDuration="2.750287697s" podCreationTimestamp="2026-04-17 21:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:17:40.747850749 +0000 UTC m=+264.444405833" watchObservedRunningTime="2026-04-17 21:17:40.750287697 +0000 UTC m=+264.446842770" Apr 17 21:17:44.077821 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:17:44.077784 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:18:16.744686 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:18:16.744659 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/0.log" Apr 17 21:18:16.745485 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:18:16.745466 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/0.log" Apr 17 21:18:16.750963 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:18:16.750940 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 21:18:39.077860 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:18:39.077815 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:18:39.093414 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:18:39.093386 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:18:39.905009 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:18:39.904980 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 21:21:00.502271 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.502235 2572 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"] Apr 17 21:21:00.505860 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.505836 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf" Apr 17 21:21:00.508139 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.508114 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 21:21:00.508139 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.508126 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 21:21:00.508337 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.508130 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 21:21:00.508337 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.508200 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-2j2w8\"" Apr 17 21:21:00.508440 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.508423 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 21:21:00.525423 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.525402 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"] Apr 17 21:21:00.587076 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.587042 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/df4c4082-f6ad-45b2-864e-fb191d6d3aa8-apiservice-cert\") pod 
\"opendatahub-operator-controller-manager-694fdf7c65-zjznf\" (UID: \"df4c4082-f6ad-45b2-864e-fb191d6d3aa8\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"
Apr 17 21:21:00.587240 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.587077 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7kk5\" (UniqueName: \"kubernetes.io/projected/df4c4082-f6ad-45b2-864e-fb191d6d3aa8-kube-api-access-m7kk5\") pod \"opendatahub-operator-controller-manager-694fdf7c65-zjznf\" (UID: \"df4c4082-f6ad-45b2-864e-fb191d6d3aa8\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"
Apr 17 21:21:00.587240 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.587139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/df4c4082-f6ad-45b2-864e-fb191d6d3aa8-webhook-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-zjznf\" (UID: \"df4c4082-f6ad-45b2-864e-fb191d6d3aa8\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"
Apr 17 21:21:00.688522 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.688486 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/df4c4082-f6ad-45b2-864e-fb191d6d3aa8-apiservice-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-zjznf\" (UID: \"df4c4082-f6ad-45b2-864e-fb191d6d3aa8\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"
Apr 17 21:21:00.688685 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.688524 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7kk5\" (UniqueName: \"kubernetes.io/projected/df4c4082-f6ad-45b2-864e-fb191d6d3aa8-kube-api-access-m7kk5\") pod \"opendatahub-operator-controller-manager-694fdf7c65-zjznf\" (UID: \"df4c4082-f6ad-45b2-864e-fb191d6d3aa8\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"
Apr 17 21:21:00.688685 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.688604 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/df4c4082-f6ad-45b2-864e-fb191d6d3aa8-webhook-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-zjznf\" (UID: \"df4c4082-f6ad-45b2-864e-fb191d6d3aa8\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"
Apr 17 21:21:00.691089 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.691058 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/df4c4082-f6ad-45b2-864e-fb191d6d3aa8-apiservice-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-zjznf\" (UID: \"df4c4082-f6ad-45b2-864e-fb191d6d3aa8\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"
Apr 17 21:21:00.691226 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.691204 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/df4c4082-f6ad-45b2-864e-fb191d6d3aa8-webhook-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-zjznf\" (UID: \"df4c4082-f6ad-45b2-864e-fb191d6d3aa8\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"
Apr 17 21:21:00.705863 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.705836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7kk5\" (UniqueName: \"kubernetes.io/projected/df4c4082-f6ad-45b2-864e-fb191d6d3aa8-kube-api-access-m7kk5\") pod \"opendatahub-operator-controller-manager-694fdf7c65-zjznf\" (UID: \"df4c4082-f6ad-45b2-864e-fb191d6d3aa8\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"
Apr 17 21:21:00.815539 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.815457 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"
Apr 17 21:21:00.937371 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.937343 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"]
Apr 17 21:21:00.941016 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:21:00.940991 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf4c4082_f6ad_45b2_864e_fb191d6d3aa8.slice/crio-1ec5f906eaf6df08323a056c0a7dff729cb5d3103dd2e28b9d8746ca0c304c4b WatchSource:0}: Error finding container 1ec5f906eaf6df08323a056c0a7dff729cb5d3103dd2e28b9d8746ca0c304c4b: Status 404 returned error can't find the container with id 1ec5f906eaf6df08323a056c0a7dff729cb5d3103dd2e28b9d8746ca0c304c4b
Apr 17 21:21:00.942728 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:00.942711 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 21:21:01.301432 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:01.301401 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf" event={"ID":"df4c4082-f6ad-45b2-864e-fb191d6d3aa8","Type":"ContainerStarted","Data":"1ec5f906eaf6df08323a056c0a7dff729cb5d3103dd2e28b9d8746ca0c304c4b"}
Apr 17 21:21:04.312824 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:04.312787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf" event={"ID":"df4c4082-f6ad-45b2-864e-fb191d6d3aa8","Type":"ContainerStarted","Data":"70e7d443f57b72d9467723e5cfe9119d8329243bc63acad39349cc90b6bba415"}
Apr 17 21:21:04.313251 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:04.312859 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"
Apr 17 21:21:04.334068 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:04.334007 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf" podStartSLOduration=1.936527222 podStartE2EDuration="4.333986922s" podCreationTimestamp="2026-04-17 21:21:00 +0000 UTC" firstStartedPulling="2026-04-17 21:21:00.942827531 +0000 UTC m=+464.639382581" lastFinishedPulling="2026-04-17 21:21:03.340287225 +0000 UTC m=+467.036842281" observedRunningTime="2026-04-17 21:21:04.331877015 +0000 UTC m=+468.028432101" watchObservedRunningTime="2026-04-17 21:21:04.333986922 +0000 UTC m=+468.030541995"
Apr 17 21:21:12.459072 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.459038 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"]
Apr 17 21:21:12.464099 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.464083 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.469717 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.469696 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 17 21:21:12.470774 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.470750 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-z2pgk\""
Apr 17 21:21:12.470871 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.470767 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 21:21:12.470871 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.470791 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 17 21:21:12.470981 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.470866 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 17 21:21:12.470981 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.470922 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 21:21:12.473675 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.473656 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"]
Apr 17 21:21:12.580660 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.580630 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgv6w\" (UniqueName: \"kubernetes.io/projected/f500dce8-6c1c-4523-9bf1-49f02322f25f-kube-api-access-rgv6w\") pod \"lws-controller-manager-54f8864c6c-j26q5\" (UID: \"f500dce8-6c1c-4523-9bf1-49f02322f25f\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.580789 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.580668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f500dce8-6c1c-4523-9bf1-49f02322f25f-manager-config\") pod \"lws-controller-manager-54f8864c6c-j26q5\" (UID: \"f500dce8-6c1c-4523-9bf1-49f02322f25f\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.580789 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.580703 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f500dce8-6c1c-4523-9bf1-49f02322f25f-cert\") pod \"lws-controller-manager-54f8864c6c-j26q5\" (UID: \"f500dce8-6c1c-4523-9bf1-49f02322f25f\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.580789 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.580718 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f500dce8-6c1c-4523-9bf1-49f02322f25f-metrics-cert\") pod \"lws-controller-manager-54f8864c6c-j26q5\" (UID: \"f500dce8-6c1c-4523-9bf1-49f02322f25f\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.681481 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.681453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f500dce8-6c1c-4523-9bf1-49f02322f25f-manager-config\") pod \"lws-controller-manager-54f8864c6c-j26q5\" (UID: \"f500dce8-6c1c-4523-9bf1-49f02322f25f\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.681600 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.681491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f500dce8-6c1c-4523-9bf1-49f02322f25f-cert\") pod \"lws-controller-manager-54f8864c6c-j26q5\" (UID: \"f500dce8-6c1c-4523-9bf1-49f02322f25f\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.681600 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.681508 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f500dce8-6c1c-4523-9bf1-49f02322f25f-metrics-cert\") pod \"lws-controller-manager-54f8864c6c-j26q5\" (UID: \"f500dce8-6c1c-4523-9bf1-49f02322f25f\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.681600 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.681564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgv6w\" (UniqueName: \"kubernetes.io/projected/f500dce8-6c1c-4523-9bf1-49f02322f25f-kube-api-access-rgv6w\") pod \"lws-controller-manager-54f8864c6c-j26q5\" (UID: \"f500dce8-6c1c-4523-9bf1-49f02322f25f\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.682055 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.682036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f500dce8-6c1c-4523-9bf1-49f02322f25f-manager-config\") pod \"lws-controller-manager-54f8864c6c-j26q5\" (UID: \"f500dce8-6c1c-4523-9bf1-49f02322f25f\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.683908 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.683887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f500dce8-6c1c-4523-9bf1-49f02322f25f-cert\") pod \"lws-controller-manager-54f8864c6c-j26q5\" (UID: \"f500dce8-6c1c-4523-9bf1-49f02322f25f\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.683987 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.683944 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f500dce8-6c1c-4523-9bf1-49f02322f25f-metrics-cert\") pod \"lws-controller-manager-54f8864c6c-j26q5\" (UID: \"f500dce8-6c1c-4523-9bf1-49f02322f25f\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.689442 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.689420 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgv6w\" (UniqueName: \"kubernetes.io/projected/f500dce8-6c1c-4523-9bf1-49f02322f25f-kube-api-access-rgv6w\") pod \"lws-controller-manager-54f8864c6c-j26q5\" (UID: \"f500dce8-6c1c-4523-9bf1-49f02322f25f\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.773572 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.773544 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:12.896104 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:12.896075 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"]
Apr 17 21:21:12.898974 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:21:12.898946 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf500dce8_6c1c_4523_9bf1_49f02322f25f.slice/crio-edd8a3a42f12eb2fe66592903be74c767d308a2d9a5920388601eb448b229c1e WatchSource:0}: Error finding container edd8a3a42f12eb2fe66592903be74c767d308a2d9a5920388601eb448b229c1e: Status 404 returned error can't find the container with id edd8a3a42f12eb2fe66592903be74c767d308a2d9a5920388601eb448b229c1e
Apr 17 21:21:13.340521 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:13.340488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5" event={"ID":"f500dce8-6c1c-4523-9bf1-49f02322f25f","Type":"ContainerStarted","Data":"edd8a3a42f12eb2fe66592903be74c767d308a2d9a5920388601eb448b229c1e"}
Apr 17 21:21:15.317978 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:15.317941 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-zjznf"
Apr 17 21:21:16.351772 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:16.351737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5" event={"ID":"f500dce8-6c1c-4523-9bf1-49f02322f25f","Type":"ContainerStarted","Data":"05aa234e788d80cc283cdcdeb36a2279557af2daa08e2ee5558e7eefe1af597e"}
Apr 17 21:21:16.352195 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:16.351788 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:16.386602 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:16.386557 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5" podStartSLOduration=1.847639589 podStartE2EDuration="4.386543864s" podCreationTimestamp="2026-04-17 21:21:12 +0000 UTC" firstStartedPulling="2026-04-17 21:21:12.900736295 +0000 UTC m=+476.597291345" lastFinishedPulling="2026-04-17 21:21:15.439640566 +0000 UTC m=+479.136195620" observedRunningTime="2026-04-17 21:21:16.384536532 +0000 UTC m=+480.081091603" watchObservedRunningTime="2026-04-17 21:21:16.386543864 +0000 UTC m=+480.083098935"
Apr 17 21:21:24.635776 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.635739 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"]
Apr 17 21:21:24.639247 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.639201 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"
Apr 17 21:21:24.641268 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.641243 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 17 21:21:24.641536 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.641518 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-phnf9\""
Apr 17 21:21:24.642030 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.642013 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 17 21:21:24.649024 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.648992 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"]
Apr 17 21:21:24.778545 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.778497 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82mj4\" (UniqueName: \"kubernetes.io/projected/018458a5-1c58-49b8-b783-bebebc553f84-kube-api-access-82mj4\") pod \"kube-auth-proxy-56dddbd4f7-5tsjk\" (UID: \"018458a5-1c58-49b8-b783-bebebc553f84\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"
Apr 17 21:21:24.778545 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.778552 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/018458a5-1c58-49b8-b783-bebebc553f84-tmp\") pod \"kube-auth-proxy-56dddbd4f7-5tsjk\" (UID: \"018458a5-1c58-49b8-b783-bebebc553f84\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"
Apr 17 21:21:24.778784 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.778608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/018458a5-1c58-49b8-b783-bebebc553f84-tls-certs\") pod \"kube-auth-proxy-56dddbd4f7-5tsjk\" (UID: \"018458a5-1c58-49b8-b783-bebebc553f84\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"
Apr 17 21:21:24.879755 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.879724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82mj4\" (UniqueName: \"kubernetes.io/projected/018458a5-1c58-49b8-b783-bebebc553f84-kube-api-access-82mj4\") pod \"kube-auth-proxy-56dddbd4f7-5tsjk\" (UID: \"018458a5-1c58-49b8-b783-bebebc553f84\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"
Apr 17 21:21:24.879932 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.879767 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/018458a5-1c58-49b8-b783-bebebc553f84-tmp\") pod \"kube-auth-proxy-56dddbd4f7-5tsjk\" (UID: \"018458a5-1c58-49b8-b783-bebebc553f84\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"
Apr 17 21:21:24.879932 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.879806 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/018458a5-1c58-49b8-b783-bebebc553f84-tls-certs\") pod \"kube-auth-proxy-56dddbd4f7-5tsjk\" (UID: \"018458a5-1c58-49b8-b783-bebebc553f84\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"
Apr 17 21:21:24.882149 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.882117 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/018458a5-1c58-49b8-b783-bebebc553f84-tmp\") pod \"kube-auth-proxy-56dddbd4f7-5tsjk\" (UID: \"018458a5-1c58-49b8-b783-bebebc553f84\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"
Apr 17 21:21:24.882320 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.882304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/018458a5-1c58-49b8-b783-bebebc553f84-tls-certs\") pod \"kube-auth-proxy-56dddbd4f7-5tsjk\" (UID: \"018458a5-1c58-49b8-b783-bebebc553f84\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"
Apr 17 21:21:24.887279 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.887231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82mj4\" (UniqueName: \"kubernetes.io/projected/018458a5-1c58-49b8-b783-bebebc553f84-kube-api-access-82mj4\") pod \"kube-auth-proxy-56dddbd4f7-5tsjk\" (UID: \"018458a5-1c58-49b8-b783-bebebc553f84\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"
Apr 17 21:21:24.950131 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:24.950081 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"
Apr 17 21:21:25.087586 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:25.087552 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk"]
Apr 17 21:21:25.090549 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:21:25.090522 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod018458a5_1c58_49b8_b783_bebebc553f84.slice/crio-97cfea9d049282954922383da0ecc0b7d61eb5b0b8074a89e2b3b217b34d9b0d WatchSource:0}: Error finding container 97cfea9d049282954922383da0ecc0b7d61eb5b0b8074a89e2b3b217b34d9b0d: Status 404 returned error can't find the container with id 97cfea9d049282954922383da0ecc0b7d61eb5b0b8074a89e2b3b217b34d9b0d
Apr 17 21:21:25.381830 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:25.381787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk" event={"ID":"018458a5-1c58-49b8-b783-bebebc553f84","Type":"ContainerStarted","Data":"97cfea9d049282954922383da0ecc0b7d61eb5b0b8074a89e2b3b217b34d9b0d"}
Apr 17 21:21:27.357612 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:27.357579 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-j26q5"
Apr 17 21:21:29.396256 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:29.396214 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk" event={"ID":"018458a5-1c58-49b8-b783-bebebc553f84","Type":"ContainerStarted","Data":"8eced40adc9aad410123f167e2284c07cf8c10bef0d76f00dab8c1dcff67b47d"}
Apr 17 21:21:29.411245 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:21:29.411186 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-5tsjk" podStartSLOduration=1.6513057070000001 podStartE2EDuration="5.411151935s" podCreationTimestamp="2026-04-17 21:21:24 +0000 UTC" firstStartedPulling="2026-04-17 21:21:25.092365188 +0000 UTC m=+488.788920239" lastFinishedPulling="2026-04-17 21:21:28.852211411 +0000 UTC m=+492.548766467" observedRunningTime="2026-04-17 21:21:29.410509793 +0000 UTC m=+493.107064864" watchObservedRunningTime="2026-04-17 21:21:29.411151935 +0000 UTC m=+493.107707007"
Apr 17 21:23:01.705807 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:01.705776 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"]
Apr 17 21:23:01.708633 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:01.708616 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"
Apr 17 21:23:01.711538 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:01.711515 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 21:23:01.712273 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:01.712256 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 21:23:01.712369 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:01.712273 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-jzsl7\""
Apr 17 21:23:01.729906 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:01.729881 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"]
Apr 17 21:23:01.795103 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:01.795067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4lfk\" (UniqueName: \"kubernetes.io/projected/7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74-kube-api-access-d4lfk\") pod \"limitador-operator-controller-manager-85c4996f8c-vl7wh\" (UID: \"7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"
Apr 17 21:23:01.896086 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:01.896051 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4lfk\" (UniqueName: \"kubernetes.io/projected/7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74-kube-api-access-d4lfk\") pod \"limitador-operator-controller-manager-85c4996f8c-vl7wh\" (UID: \"7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"
Apr 17 21:23:01.903673 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:01.903640 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4lfk\" (UniqueName: \"kubernetes.io/projected/7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74-kube-api-access-d4lfk\") pod \"limitador-operator-controller-manager-85c4996f8c-vl7wh\" (UID: \"7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"
Apr 17 21:23:02.018776 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:02.018732 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"
Apr 17 21:23:02.140602 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:02.140569 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"]
Apr 17 21:23:02.143589 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:23:02.143564 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e5e5c6e_e4a9_4222_a7f5_c7d35c09ec74.slice/crio-6298e8efbdcf323a5b3142b8fa0ea2b29fe2b60ff0a2a7f16ad1fd439e47c7d6 WatchSource:0}: Error finding container 6298e8efbdcf323a5b3142b8fa0ea2b29fe2b60ff0a2a7f16ad1fd439e47c7d6: Status 404 returned error can't find the container with id 6298e8efbdcf323a5b3142b8fa0ea2b29fe2b60ff0a2a7f16ad1fd439e47c7d6
Apr 17 21:23:02.699044 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:02.699009 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh" event={"ID":"7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74","Type":"ContainerStarted","Data":"6298e8efbdcf323a5b3142b8fa0ea2b29fe2b60ff0a2a7f16ad1fd439e47c7d6"}
Apr 17 21:23:04.706472 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:04.706395 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh" event={"ID":"7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74","Type":"ContainerStarted","Data":"082b4de72cd6356b8de954470ad2f6b8bc7b00eb083623ffd1237775fded124c"}
Apr 17 21:23:04.706817 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:04.706504 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"
Apr 17 21:23:04.721111 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:04.721070 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh" podStartSLOduration=1.47539435 podStartE2EDuration="3.721056628s" podCreationTimestamp="2026-04-17 21:23:01 +0000 UTC" firstStartedPulling="2026-04-17 21:23:02.145523141 +0000 UTC m=+585.842078195" lastFinishedPulling="2026-04-17 21:23:04.391185409 +0000 UTC m=+588.087740473" observedRunningTime="2026-04-17 21:23:04.719853738 +0000 UTC m=+588.416408810" watchObservedRunningTime="2026-04-17 21:23:04.721056628 +0000 UTC m=+588.417611700"
Apr 17 21:23:14.729555 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:14.729512 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"]
Apr 17 21:23:14.729979 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:14.729770 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh" podUID="7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74" containerName="manager" containerID="cri-o://082b4de72cd6356b8de954470ad2f6b8bc7b00eb083623ffd1237775fded124c" gracePeriod=2
Apr 17 21:23:14.731456 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:14.731418 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"
Apr 17 21:23:14.733713 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:14.733687 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"]
Apr 17 21:23:14.737020 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:14.736983 2572 status_manager.go:895] "Failed to get status for pod" podUID="7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh" err="pods \"limitador-operator-controller-manager-85c4996f8c-vl7wh\" is forbidden: User \"system:node:ip-10-0-135-174.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-174.ec2.internal' and this object"
Apr 17 21:23:14.963887 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:14.963864 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"
Apr 17 21:23:15.005944 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:15.005856 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lfk\" (UniqueName: \"kubernetes.io/projected/7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74-kube-api-access-d4lfk\") pod \"7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74\" (UID: \"7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74\") "
Apr 17 21:23:15.007968 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:15.007939 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74-kube-api-access-d4lfk" (OuterVolumeSpecName: "kube-api-access-d4lfk") pod "7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74" (UID: "7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74"). InnerVolumeSpecName "kube-api-access-d4lfk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:23:15.106519 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:15.106482 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4lfk\" (UniqueName: \"kubernetes.io/projected/7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74-kube-api-access-d4lfk\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\""
Apr 17 21:23:15.745035 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:15.744998 2572 generic.go:358] "Generic (PLEG): container finished" podID="7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74" containerID="082b4de72cd6356b8de954470ad2f6b8bc7b00eb083623ffd1237775fded124c" exitCode=0
Apr 17 21:23:15.745533 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:15.745048 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh"
Apr 17 21:23:15.745533 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:15.745103 2572 scope.go:117] "RemoveContainer" containerID="082b4de72cd6356b8de954470ad2f6b8bc7b00eb083623ffd1237775fded124c"
Apr 17 21:23:15.753304 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:15.753284 2572 scope.go:117] "RemoveContainer" containerID="082b4de72cd6356b8de954470ad2f6b8bc7b00eb083623ffd1237775fded124c"
Apr 17 21:23:15.753569 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:23:15.753550 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"082b4de72cd6356b8de954470ad2f6b8bc7b00eb083623ffd1237775fded124c\": container with ID starting with 082b4de72cd6356b8de954470ad2f6b8bc7b00eb083623ffd1237775fded124c not found: ID does not exist" containerID="082b4de72cd6356b8de954470ad2f6b8bc7b00eb083623ffd1237775fded124c"
Apr 17 21:23:15.753618 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:15.753579 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082b4de72cd6356b8de954470ad2f6b8bc7b00eb083623ffd1237775fded124c"} err="failed to get container status \"082b4de72cd6356b8de954470ad2f6b8bc7b00eb083623ffd1237775fded124c\": rpc error: code = NotFound desc = could not find container \"082b4de72cd6356b8de954470ad2f6b8bc7b00eb083623ffd1237775fded124c\": container with ID starting with 082b4de72cd6356b8de954470ad2f6b8bc7b00eb083623ffd1237775fded124c not found: ID does not exist"
Apr 17 21:23:15.754816 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:15.754793 2572 status_manager.go:895] "Failed to get status for pod" podUID="7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh" err="pods \"limitador-operator-controller-manager-85c4996f8c-vl7wh\" is forbidden: User \"system:node:ip-10-0-135-174.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-174.ec2.internal' and this object"
Apr 17 21:23:16.778882 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:16.778859 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/0.log"
Apr 17 21:23:16.779284 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:16.779110 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/0.log"
Apr 17 21:23:16.868753 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:16.868701 2572 status_manager.go:895] "Failed to get status for pod" podUID="7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vl7wh" err="pods \"limitador-operator-controller-manager-85c4996f8c-vl7wh\" is forbidden: User \"system:node:ip-10-0-135-174.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-174.ec2.internal' and this object"
Apr 17 21:23:16.869089 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:16.869067 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74" path="/var/lib/kubelet/pods/7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74/volumes"
Apr 17 21:23:59.197365 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.197281 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7bxgj"]
Apr 17 21:23:59.197782 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.197616 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74" containerName="manager"
Apr 17 21:23:59.197782 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.197627 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74" containerName="manager"
Apr 17 21:23:59.197782 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.197690 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e5e5c6e-e4a9-4222-a7f5-c7d35c09ec74" containerName="manager"
Apr 17 21:23:59.200976 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.200959 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj"
Apr 17 21:23:59.203409 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.203386 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-cdvnh\""
Apr 17 21:23:59.203530 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.203396 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 21:23:59.204211 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.204193 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 21:23:59.204292 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.204209 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 21:23:59.208031 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.207989 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7bxgj"]
Apr 17 21:23:59.268844 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.268808 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64t87\" (UniqueName: \"kubernetes.io/projected/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c-kube-api-access-64t87\") pod \"limitador-limitador-7d549b5b-7bxgj\" (UID: \"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj"
Apr 17 21:23:59.269033 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.268928 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c-config-file\") pod \"limitador-limitador-7d549b5b-7bxgj\" (UID: \"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c\") "
pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" Apr 17 21:23:59.294906 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.294872 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7bxgj"] Apr 17 21:23:59.370140 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.370100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64t87\" (UniqueName: \"kubernetes.io/projected/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c-kube-api-access-64t87\") pod \"limitador-limitador-7d549b5b-7bxgj\" (UID: \"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" Apr 17 21:23:59.370343 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.370193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c-config-file\") pod \"limitador-limitador-7d549b5b-7bxgj\" (UID: \"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" Apr 17 21:23:59.370769 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.370752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c-config-file\") pod \"limitador-limitador-7d549b5b-7bxgj\" (UID: \"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" Apr 17 21:23:59.377540 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.377513 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64t87\" (UniqueName: \"kubernetes.io/projected/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c-kube-api-access-64t87\") pod \"limitador-limitador-7d549b5b-7bxgj\" (UID: \"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" Apr 17 21:23:59.511375 ip-10-0-135-174 
kubenswrapper[2572]: I0417 21:23:59.511339 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" Apr 17 21:23:59.632000 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.631976 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7bxgj"] Apr 17 21:23:59.634940 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:23:59.634911 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a2bfcc8_e3e1_49aa_8004_3ed84a04472c.slice/crio-cc50504bf724ac0269503ca8f66513253a49ea3901220f2d8c3bab688bf76922 WatchSource:0}: Error finding container cc50504bf724ac0269503ca8f66513253a49ea3901220f2d8c3bab688bf76922: Status 404 returned error can't find the container with id cc50504bf724ac0269503ca8f66513253a49ea3901220f2d8c3bab688bf76922 Apr 17 21:23:59.894211 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.894111 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" event={"ID":"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c","Type":"ContainerStarted","Data":"cc50504bf724ac0269503ca8f66513253a49ea3901220f2d8c3bab688bf76922"} Apr 17 21:23:59.986906 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.986874 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9kkr5"] Apr 17 21:23:59.991974 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.991951 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" Apr 17 21:23:59.995317 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.994488 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-645ll\"" Apr 17 21:23:59.996451 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:23:59.996429 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9kkr5"] Apr 17 21:24:00.076333 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.076288 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jv7s\" (UniqueName: \"kubernetes.io/projected/8a82919a-80e0-4b8e-867e-9c75851ee14f-kube-api-access-2jv7s\") pod \"authorino-f99f4b5cd-9kkr5\" (UID: \"8a82919a-80e0-4b8e-867e-9c75851ee14f\") " pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" Apr 17 21:24:00.177644 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.177559 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jv7s\" (UniqueName: \"kubernetes.io/projected/8a82919a-80e0-4b8e-867e-9c75851ee14f-kube-api-access-2jv7s\") pod \"authorino-f99f4b5cd-9kkr5\" (UID: \"8a82919a-80e0-4b8e-867e-9c75851ee14f\") " pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" Apr 17 21:24:00.181021 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.180992 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-z2d7j"] Apr 17 21:24:00.184684 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.184663 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-z2d7j" Apr 17 21:24:00.190402 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.190373 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-z2d7j"] Apr 17 21:24:00.192474 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.192455 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jv7s\" (UniqueName: \"kubernetes.io/projected/8a82919a-80e0-4b8e-867e-9c75851ee14f-kube-api-access-2jv7s\") pod \"authorino-f99f4b5cd-9kkr5\" (UID: \"8a82919a-80e0-4b8e-867e-9c75851ee14f\") " pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" Apr 17 21:24:00.278688 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.278653 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c9fc\" (UniqueName: \"kubernetes.io/projected/cb575c6a-4f1b-48cd-b764-eab8ce593c57-kube-api-access-6c9fc\") pod \"authorino-7498df8756-z2d7j\" (UID: \"cb575c6a-4f1b-48cd-b764-eab8ce593c57\") " pod="kuadrant-system/authorino-7498df8756-z2d7j" Apr 17 21:24:00.308229 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.308200 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" Apr 17 21:24:00.379261 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.379228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6c9fc\" (UniqueName: \"kubernetes.io/projected/cb575c6a-4f1b-48cd-b764-eab8ce593c57-kube-api-access-6c9fc\") pod \"authorino-7498df8756-z2d7j\" (UID: \"cb575c6a-4f1b-48cd-b764-eab8ce593c57\") " pod="kuadrant-system/authorino-7498df8756-z2d7j" Apr 17 21:24:00.386807 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.386780 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c9fc\" (UniqueName: \"kubernetes.io/projected/cb575c6a-4f1b-48cd-b764-eab8ce593c57-kube-api-access-6c9fc\") pod \"authorino-7498df8756-z2d7j\" (UID: \"cb575c6a-4f1b-48cd-b764-eab8ce593c57\") " pod="kuadrant-system/authorino-7498df8756-z2d7j" Apr 17 21:24:00.449050 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.448828 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9kkr5"] Apr 17 21:24:00.454654 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:24:00.454616 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a82919a_80e0_4b8e_867e_9c75851ee14f.slice/crio-a0c1b840c30417e479adc9e6e14b69216210a8d2651c4b1e468917a86264fdb2 WatchSource:0}: Error finding container a0c1b840c30417e479adc9e6e14b69216210a8d2651c4b1e468917a86264fdb2: Status 404 returned error can't find the container with id a0c1b840c30417e479adc9e6e14b69216210a8d2651c4b1e468917a86264fdb2 Apr 17 21:24:00.502560 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.502515 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-z2d7j" Apr 17 21:24:00.680793 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.680761 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-z2d7j"] Apr 17 21:24:00.687236 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:24:00.687204 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb575c6a_4f1b_48cd_b764_eab8ce593c57.slice/crio-4133f3d88d652c21eca4d6231e1d2362f7cfb863c25c87632dd3d52878f78ad5 WatchSource:0}: Error finding container 4133f3d88d652c21eca4d6231e1d2362f7cfb863c25c87632dd3d52878f78ad5: Status 404 returned error can't find the container with id 4133f3d88d652c21eca4d6231e1d2362f7cfb863c25c87632dd3d52878f78ad5 Apr 17 21:24:00.901503 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.901462 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" event={"ID":"8a82919a-80e0-4b8e-867e-9c75851ee14f","Type":"ContainerStarted","Data":"a0c1b840c30417e479adc9e6e14b69216210a8d2651c4b1e468917a86264fdb2"} Apr 17 21:24:00.902681 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:00.902651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-z2d7j" event={"ID":"cb575c6a-4f1b-48cd-b764-eab8ce593c57","Type":"ContainerStarted","Data":"4133f3d88d652c21eca4d6231e1d2362f7cfb863c25c87632dd3d52878f78ad5"} Apr 17 21:24:04.923206 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:04.923152 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-z2d7j" event={"ID":"cb575c6a-4f1b-48cd-b764-eab8ce593c57","Type":"ContainerStarted","Data":"32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255"} Apr 17 21:24:04.924443 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:04.924420 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" event={"ID":"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c","Type":"ContainerStarted","Data":"1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8"} Apr 17 21:24:04.924570 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:04.924533 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" Apr 17 21:24:04.925546 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:04.925527 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" event={"ID":"8a82919a-80e0-4b8e-867e-9c75851ee14f","Type":"ContainerStarted","Data":"47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600"} Apr 17 21:24:04.937757 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:04.937719 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-z2d7j" podStartSLOduration=1.244135986 podStartE2EDuration="4.937707858s" podCreationTimestamp="2026-04-17 21:24:00 +0000 UTC" firstStartedPulling="2026-04-17 21:24:00.689781597 +0000 UTC m=+644.386336649" lastFinishedPulling="2026-04-17 21:24:04.383353456 +0000 UTC m=+648.079908521" observedRunningTime="2026-04-17 21:24:04.936586256 +0000 UTC m=+648.633141329" watchObservedRunningTime="2026-04-17 21:24:04.937707858 +0000 UTC m=+648.634262931" Apr 17 21:24:04.952110 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:04.952062 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" podStartSLOduration=1.272536029 podStartE2EDuration="5.952048695s" podCreationTimestamp="2026-04-17 21:23:59 +0000 UTC" firstStartedPulling="2026-04-17 21:23:59.637249831 +0000 UTC m=+643.333804897" lastFinishedPulling="2026-04-17 21:24:04.31676251 +0000 UTC m=+648.013317563" observedRunningTime="2026-04-17 21:24:04.950097251 +0000 UTC m=+648.646652323" 
watchObservedRunningTime="2026-04-17 21:24:04.952048695 +0000 UTC m=+648.648603769" Apr 17 21:24:04.958823 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:04.958802 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9kkr5"] Apr 17 21:24:04.969712 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:04.969665 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" podStartSLOduration=2.038712762 podStartE2EDuration="5.969650102s" podCreationTimestamp="2026-04-17 21:23:59 +0000 UTC" firstStartedPulling="2026-04-17 21:24:00.456600512 +0000 UTC m=+644.153155567" lastFinishedPulling="2026-04-17 21:24:04.38753784 +0000 UTC m=+648.084092907" observedRunningTime="2026-04-17 21:24:04.968833969 +0000 UTC m=+648.665389041" watchObservedRunningTime="2026-04-17 21:24:04.969650102 +0000 UTC m=+648.666205176" Apr 17 21:24:06.933501 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:06.933451 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" podUID="8a82919a-80e0-4b8e-867e-9c75851ee14f" containerName="authorino" containerID="cri-o://47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600" gracePeriod=30 Apr 17 21:24:07.176868 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.176847 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" Apr 17 21:24:07.242186 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.242140 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jv7s\" (UniqueName: \"kubernetes.io/projected/8a82919a-80e0-4b8e-867e-9c75851ee14f-kube-api-access-2jv7s\") pod \"8a82919a-80e0-4b8e-867e-9c75851ee14f\" (UID: \"8a82919a-80e0-4b8e-867e-9c75851ee14f\") " Apr 17 21:24:07.244198 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.244153 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a82919a-80e0-4b8e-867e-9c75851ee14f-kube-api-access-2jv7s" (OuterVolumeSpecName: "kube-api-access-2jv7s") pod "8a82919a-80e0-4b8e-867e-9c75851ee14f" (UID: "8a82919a-80e0-4b8e-867e-9c75851ee14f"). InnerVolumeSpecName "kube-api-access-2jv7s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:24:07.342913 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.342866 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2jv7s\" (UniqueName: \"kubernetes.io/projected/8a82919a-80e0-4b8e-867e-9c75851ee14f-kube-api-access-2jv7s\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:24:07.938258 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.938219 2572 generic.go:358] "Generic (PLEG): container finished" podID="8a82919a-80e0-4b8e-867e-9c75851ee14f" containerID="47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600" exitCode=0 Apr 17 21:24:07.938742 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.938290 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" Apr 17 21:24:07.938742 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.938302 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" event={"ID":"8a82919a-80e0-4b8e-867e-9c75851ee14f","Type":"ContainerDied","Data":"47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600"} Apr 17 21:24:07.938742 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.938339 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-9kkr5" event={"ID":"8a82919a-80e0-4b8e-867e-9c75851ee14f","Type":"ContainerDied","Data":"a0c1b840c30417e479adc9e6e14b69216210a8d2651c4b1e468917a86264fdb2"} Apr 17 21:24:07.938742 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.938355 2572 scope.go:117] "RemoveContainer" containerID="47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600" Apr 17 21:24:07.946860 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.946840 2572 scope.go:117] "RemoveContainer" containerID="47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600" Apr 17 21:24:07.947096 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:24:07.947076 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600\": container with ID starting with 47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600 not found: ID does not exist" containerID="47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600" Apr 17 21:24:07.947142 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.947106 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600"} err="failed to get container status \"47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600\": rpc error: code = 
NotFound desc = could not find container \"47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600\": container with ID starting with 47bbc0d2d7a14a6b80a6fd2583a07c2dac78541deeacec896e09593204a34600 not found: ID does not exist" Apr 17 21:24:07.957007 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.956986 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9kkr5"] Apr 17 21:24:07.958819 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:07.958800 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9kkr5"] Apr 17 21:24:08.868313 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:08.868280 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a82919a-80e0-4b8e-867e-9c75851ee14f" path="/var/lib/kubelet/pods/8a82919a-80e0-4b8e-867e-9c75851ee14f/volumes" Apr 17 21:24:14.326062 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.326031 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7bxgj"] Apr 17 21:24:14.326533 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.326324 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" podUID="4a2bfcc8-e3e1-49aa-8004-3ed84a04472c" containerName="limitador" containerID="cri-o://1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8" gracePeriod=30 Apr 17 21:24:14.327017 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.326969 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" Apr 17 21:24:14.871346 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.871324 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" Apr 17 21:24:14.960756 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.960688 2572 generic.go:358] "Generic (PLEG): container finished" podID="4a2bfcc8-e3e1-49aa-8004-3ed84a04472c" containerID="1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8" exitCode=0 Apr 17 21:24:14.960756 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.960743 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" Apr 17 21:24:14.960956 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.960773 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" event={"ID":"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c","Type":"ContainerDied","Data":"1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8"} Apr 17 21:24:14.960956 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.960812 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-7bxgj" event={"ID":"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c","Type":"ContainerDied","Data":"cc50504bf724ac0269503ca8f66513253a49ea3901220f2d8c3bab688bf76922"} Apr 17 21:24:14.960956 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.960833 2572 scope.go:117] "RemoveContainer" containerID="1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8" Apr 17 21:24:14.968038 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.968021 2572 scope.go:117] "RemoveContainer" containerID="1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8" Apr 17 21:24:14.968283 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:24:14.968265 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8\": container with ID starting with 
1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8 not found: ID does not exist" containerID="1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8" Apr 17 21:24:14.968344 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.968290 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8"} err="failed to get container status \"1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8\": rpc error: code = NotFound desc = could not find container \"1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8\": container with ID starting with 1a90ff315b9a35ca7f037e66e03e6e5152502ded1a6d85c76af45a9e3d7acfd8 not found: ID does not exist" Apr 17 21:24:14.999669 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.999646 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64t87\" (UniqueName: \"kubernetes.io/projected/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c-kube-api-access-64t87\") pod \"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c\" (UID: \"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c\") " Apr 17 21:24:14.999756 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:14.999720 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c-config-file\") pod \"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c\" (UID: \"4a2bfcc8-e3e1-49aa-8004-3ed84a04472c\") " Apr 17 21:24:15.000043 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:15.000015 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c-config-file" (OuterVolumeSpecName: "config-file") pod "4a2bfcc8-e3e1-49aa-8004-3ed84a04472c" (UID: "4a2bfcc8-e3e1-49aa-8004-3ed84a04472c"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:24:15.001488 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:15.001470 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c-kube-api-access-64t87" (OuterVolumeSpecName: "kube-api-access-64t87") pod "4a2bfcc8-e3e1-49aa-8004-3ed84a04472c" (UID: "4a2bfcc8-e3e1-49aa-8004-3ed84a04472c"). InnerVolumeSpecName "kube-api-access-64t87". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:24:15.101272 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:15.101241 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-64t87\" (UniqueName: \"kubernetes.io/projected/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c-kube-api-access-64t87\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:24:15.101272 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:15.101269 2572 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c-config-file\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:24:15.285668 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:15.285641 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7bxgj"] Apr 17 21:24:15.287228 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:15.287207 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7bxgj"] Apr 17 21:24:16.868183 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:16.868137 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2bfcc8-e3e1-49aa-8004-3ed84a04472c" path="/var/lib/kubelet/pods/4a2bfcc8-e3e1-49aa-8004-3ed84a04472c/volumes" Apr 17 21:24:20.227226 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.227152 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["opendatahub/postgres-868db5846d-hbzsk"] Apr 17 21:24:20.227742 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.227697 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a82919a-80e0-4b8e-867e-9c75851ee14f" containerName="authorino" Apr 17 21:24:20.227742 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.227717 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a82919a-80e0-4b8e-867e-9c75851ee14f" containerName="authorino" Apr 17 21:24:20.227742 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.227729 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a2bfcc8-e3e1-49aa-8004-3ed84a04472c" containerName="limitador" Apr 17 21:24:20.227742 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.227737 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2bfcc8-e3e1-49aa-8004-3ed84a04472c" containerName="limitador" Apr 17 21:24:20.227946 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.227830 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a2bfcc8-e3e1-49aa-8004-3ed84a04472c" containerName="limitador" Apr 17 21:24:20.227946 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.227844 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a82919a-80e0-4b8e-867e-9c75851ee14f" containerName="authorino" Apr 17 21:24:20.232483 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.232457 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-hbzsk"
Apr 17 21:24:20.234542 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.234515 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 17 21:24:20.234542 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.234535 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-tzk6w\""
Apr 17 21:24:20.238764 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.238740 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-hbzsk"]
Apr 17 21:24:20.343789 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.343754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4sth\" (UniqueName: \"kubernetes.io/projected/c75eaa04-92b2-4fa9-8dad-f52397712f27-kube-api-access-r4sth\") pod \"postgres-868db5846d-hbzsk\" (UID: \"c75eaa04-92b2-4fa9-8dad-f52397712f27\") " pod="opendatahub/postgres-868db5846d-hbzsk"
Apr 17 21:24:20.343789 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.343801 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c75eaa04-92b2-4fa9-8dad-f52397712f27-data\") pod \"postgres-868db5846d-hbzsk\" (UID: \"c75eaa04-92b2-4fa9-8dad-f52397712f27\") " pod="opendatahub/postgres-868db5846d-hbzsk"
Apr 17 21:24:20.444497 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.444444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4sth\" (UniqueName: \"kubernetes.io/projected/c75eaa04-92b2-4fa9-8dad-f52397712f27-kube-api-access-r4sth\") pod \"postgres-868db5846d-hbzsk\" (UID: \"c75eaa04-92b2-4fa9-8dad-f52397712f27\") " pod="opendatahub/postgres-868db5846d-hbzsk"
Apr 17 21:24:20.444497 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.444504 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c75eaa04-92b2-4fa9-8dad-f52397712f27-data\") pod \"postgres-868db5846d-hbzsk\" (UID: \"c75eaa04-92b2-4fa9-8dad-f52397712f27\") " pod="opendatahub/postgres-868db5846d-hbzsk"
Apr 17 21:24:20.444831 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.444815 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c75eaa04-92b2-4fa9-8dad-f52397712f27-data\") pod \"postgres-868db5846d-hbzsk\" (UID: \"c75eaa04-92b2-4fa9-8dad-f52397712f27\") " pod="opendatahub/postgres-868db5846d-hbzsk"
Apr 17 21:24:20.452599 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.452573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4sth\" (UniqueName: \"kubernetes.io/projected/c75eaa04-92b2-4fa9-8dad-f52397712f27-kube-api-access-r4sth\") pod \"postgres-868db5846d-hbzsk\" (UID: \"c75eaa04-92b2-4fa9-8dad-f52397712f27\") " pod="opendatahub/postgres-868db5846d-hbzsk"
Apr 17 21:24:20.545739 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.545654 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-hbzsk"
Apr 17 21:24:20.663376 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.663344 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-hbzsk"]
Apr 17 21:24:20.666084 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:24:20.666059 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc75eaa04_92b2_4fa9_8dad_f52397712f27.slice/crio-2cf1945018a6ee0e3be09fb91efa967301ae5a84d06b400055565ff01a2ca2dd WatchSource:0}: Error finding container 2cf1945018a6ee0e3be09fb91efa967301ae5a84d06b400055565ff01a2ca2dd: Status 404 returned error can't find the container with id 2cf1945018a6ee0e3be09fb91efa967301ae5a84d06b400055565ff01a2ca2dd
Apr 17 21:24:20.981984 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:20.981937 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-hbzsk" event={"ID":"c75eaa04-92b2-4fa9-8dad-f52397712f27","Type":"ContainerStarted","Data":"2cf1945018a6ee0e3be09fb91efa967301ae5a84d06b400055565ff01a2ca2dd"}
Apr 17 21:24:26.002250 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:26.002214 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-hbzsk" event={"ID":"c75eaa04-92b2-4fa9-8dad-f52397712f27","Type":"ContainerStarted","Data":"f33437aa062a5b1c94d8d35dd9dd5849a26763a2033cd67eaa69ac34c31419f1"}
Apr 17 21:24:26.002619 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:26.002296 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-hbzsk"
Apr 17 21:24:26.017774 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:26.017718 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-hbzsk" podStartSLOduration=0.980281418 podStartE2EDuration="6.017701978s" podCreationTimestamp="2026-04-17 21:24:20 +0000 UTC" firstStartedPulling="2026-04-17 21:24:20.66782308 +0000 UTC m=+664.364378130" lastFinishedPulling="2026-04-17 21:24:25.70524364 +0000 UTC m=+669.401798690" observedRunningTime="2026-04-17 21:24:26.015880168 +0000 UTC m=+669.712435240" watchObservedRunningTime="2026-04-17 21:24:26.017701978 +0000 UTC m=+669.714257050"
Apr 17 21:24:32.034450 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.034414 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-hbzsk"
Apr 17 21:24:32.539888 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.539851 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-q9lhq"]
Apr 17 21:24:32.546141 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.546116 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-q9lhq"
Apr 17 21:24:32.548872 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.548847 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-q9lhq"]
Apr 17 21:24:32.652271 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.652228 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8qsf\" (UniqueName: \"kubernetes.io/projected/558d628d-438f-4c53-abc6-e7938c252aa4-kube-api-access-p8qsf\") pod \"authorino-8b475cf9f-q9lhq\" (UID: \"558d628d-438f-4c53-abc6-e7938c252aa4\") " pod="kuadrant-system/authorino-8b475cf9f-q9lhq"
Apr 17 21:24:32.714745 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.714705 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-q9lhq"]
Apr 17 21:24:32.714981 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:24:32.714960 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-p8qsf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-q9lhq" podUID="558d628d-438f-4c53-abc6-e7938c252aa4"
Apr 17 21:24:32.738424 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.738392 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-56fdd757f5-5spm2"]
Apr 17 21:24:32.742214 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.742194 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-5spm2"
Apr 17 21:24:32.744385 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.744360 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 17 21:24:32.750329 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.750301 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-5spm2"]
Apr 17 21:24:32.753464 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.753442 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8qsf\" (UniqueName: \"kubernetes.io/projected/558d628d-438f-4c53-abc6-e7938c252aa4-kube-api-access-p8qsf\") pod \"authorino-8b475cf9f-q9lhq\" (UID: \"558d628d-438f-4c53-abc6-e7938c252aa4\") " pod="kuadrant-system/authorino-8b475cf9f-q9lhq"
Apr 17 21:24:32.765342 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.765315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8qsf\" (UniqueName: \"kubernetes.io/projected/558d628d-438f-4c53-abc6-e7938c252aa4-kube-api-access-p8qsf\") pod \"authorino-8b475cf9f-q9lhq\" (UID: \"558d628d-438f-4c53-abc6-e7938c252aa4\") " pod="kuadrant-system/authorino-8b475cf9f-q9lhq"
Apr 17 21:24:32.784736 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.784694 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-5spm2"]
Apr 17 21:24:32.785007 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:24:32.784981 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-mtnwl tls-cert], unattached volumes=[], failed to process volumes=[kube-api-access-mtnwl tls-cert]: context canceled" pod="kuadrant-system/authorino-56fdd757f5-5spm2" podUID="31f54739-a583-46c6-9052-0b86c21f2f16"
Apr 17 21:24:32.814924 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.814825 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7f57fb5654-f4ngj"]
Apr 17 21:24:32.818794 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.818767 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7f57fb5654-f4ngj"
Apr 17 21:24:32.826250 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.826225 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7f57fb5654-f4ngj"]
Apr 17 21:24:32.854335 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.854297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/31f54739-a583-46c6-9052-0b86c21f2f16-tls-cert\") pod \"authorino-56fdd757f5-5spm2\" (UID: \"31f54739-a583-46c6-9052-0b86c21f2f16\") " pod="kuadrant-system/authorino-56fdd757f5-5spm2"
Apr 17 21:24:32.854335 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.854350 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtnwl\" (UniqueName: \"kubernetes.io/projected/31f54739-a583-46c6-9052-0b86c21f2f16-kube-api-access-mtnwl\") pod \"authorino-56fdd757f5-5spm2\" (UID: \"31f54739-a583-46c6-9052-0b86c21f2f16\") " pod="kuadrant-system/authorino-56fdd757f5-5spm2"
Apr 17 21:24:32.954894 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.954852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqqw2\" (UniqueName: \"kubernetes.io/projected/ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1-kube-api-access-lqqw2\") pod \"authorino-7f57fb5654-f4ngj\" (UID: \"ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1\") " pod="kuadrant-system/authorino-7f57fb5654-f4ngj"
Apr 17 21:24:32.954894 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.954897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/31f54739-a583-46c6-9052-0b86c21f2f16-tls-cert\") pod \"authorino-56fdd757f5-5spm2\" (UID: \"31f54739-a583-46c6-9052-0b86c21f2f16\") " pod="kuadrant-system/authorino-56fdd757f5-5spm2"
Apr 17 21:24:32.955121 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.954925 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1-tls-cert\") pod \"authorino-7f57fb5654-f4ngj\" (UID: \"ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1\") " pod="kuadrant-system/authorino-7f57fb5654-f4ngj"
Apr 17 21:24:32.955121 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.955015 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtnwl\" (UniqueName: \"kubernetes.io/projected/31f54739-a583-46c6-9052-0b86c21f2f16-kube-api-access-mtnwl\") pod \"authorino-56fdd757f5-5spm2\" (UID: \"31f54739-a583-46c6-9052-0b86c21f2f16\") " pod="kuadrant-system/authorino-56fdd757f5-5spm2"
Apr 17 21:24:32.957396 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.957365 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/31f54739-a583-46c6-9052-0b86c21f2f16-tls-cert\") pod \"authorino-56fdd757f5-5spm2\" (UID: \"31f54739-a583-46c6-9052-0b86c21f2f16\") " pod="kuadrant-system/authorino-56fdd757f5-5spm2"
Apr 17 21:24:32.962666 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:32.962645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtnwl\" (UniqueName: \"kubernetes.io/projected/31f54739-a583-46c6-9052-0b86c21f2f16-kube-api-access-mtnwl\") pod \"authorino-56fdd757f5-5spm2\" (UID: \"31f54739-a583-46c6-9052-0b86c21f2f16\") " pod="kuadrant-system/authorino-56fdd757f5-5spm2"
Apr 17 21:24:33.022938 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.022907 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-5spm2"
Apr 17 21:24:33.023108 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.022908 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-q9lhq"
Apr 17 21:24:33.027997 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.027970 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-5spm2"
Apr 17 21:24:33.031431 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.031411 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-q9lhq"
Apr 17 21:24:33.055702 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.055670 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqqw2\" (UniqueName: \"kubernetes.io/projected/ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1-kube-api-access-lqqw2\") pod \"authorino-7f57fb5654-f4ngj\" (UID: \"ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1\") " pod="kuadrant-system/authorino-7f57fb5654-f4ngj"
Apr 17 21:24:33.055702 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.055711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1-tls-cert\") pod \"authorino-7f57fb5654-f4ngj\" (UID: \"ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1\") " pod="kuadrant-system/authorino-7f57fb5654-f4ngj"
Apr 17 21:24:33.058259 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.058230 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1-tls-cert\") pod \"authorino-7f57fb5654-f4ngj\" (UID: \"ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1\") " pod="kuadrant-system/authorino-7f57fb5654-f4ngj"
Apr 17 21:24:33.063748 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.063718 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqqw2\" (UniqueName: \"kubernetes.io/projected/ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1-kube-api-access-lqqw2\") pod \"authorino-7f57fb5654-f4ngj\" (UID: \"ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1\") " pod="kuadrant-system/authorino-7f57fb5654-f4ngj"
Apr 17 21:24:33.129158 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.129059 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7f57fb5654-f4ngj"
Apr 17 21:24:33.156049 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.156015 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8qsf\" (UniqueName: \"kubernetes.io/projected/558d628d-438f-4c53-abc6-e7938c252aa4-kube-api-access-p8qsf\") pod \"558d628d-438f-4c53-abc6-e7938c252aa4\" (UID: \"558d628d-438f-4c53-abc6-e7938c252aa4\") "
Apr 17 21:24:33.156254 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.156091 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtnwl\" (UniqueName: \"kubernetes.io/projected/31f54739-a583-46c6-9052-0b86c21f2f16-kube-api-access-mtnwl\") pod \"31f54739-a583-46c6-9052-0b86c21f2f16\" (UID: \"31f54739-a583-46c6-9052-0b86c21f2f16\") "
Apr 17 21:24:33.156254 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.156112 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/31f54739-a583-46c6-9052-0b86c21f2f16-tls-cert\") pod \"31f54739-a583-46c6-9052-0b86c21f2f16\" (UID: \"31f54739-a583-46c6-9052-0b86c21f2f16\") "
Apr 17 21:24:33.158266 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.158230 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/558d628d-438f-4c53-abc6-e7938c252aa4-kube-api-access-p8qsf" (OuterVolumeSpecName: "kube-api-access-p8qsf") pod "558d628d-438f-4c53-abc6-e7938c252aa4" (UID: "558d628d-438f-4c53-abc6-e7938c252aa4"). InnerVolumeSpecName "kube-api-access-p8qsf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:24:33.158266 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.158244 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31f54739-a583-46c6-9052-0b86c21f2f16-kube-api-access-mtnwl" (OuterVolumeSpecName: "kube-api-access-mtnwl") pod "31f54739-a583-46c6-9052-0b86c21f2f16" (UID: "31f54739-a583-46c6-9052-0b86c21f2f16"). InnerVolumeSpecName "kube-api-access-mtnwl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:24:33.158430 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.158341 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f54739-a583-46c6-9052-0b86c21f2f16-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "31f54739-a583-46c6-9052-0b86c21f2f16" (UID: "31f54739-a583-46c6-9052-0b86c21f2f16"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 21:24:33.255254 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.255221 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7f57fb5654-f4ngj"]
Apr 17 21:24:33.257373 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.257346 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mtnwl\" (UniqueName: \"kubernetes.io/projected/31f54739-a583-46c6-9052-0b86c21f2f16-kube-api-access-mtnwl\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\""
Apr 17 21:24:33.257373 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:24:33.257361 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec083ab0_4ec0_4e8c_9da5_aa6d5f6591a1.slice/crio-d9a021c1aab73b60b9acb571b4b55cf6f72d71585e4f5346c9d47df787096d1e WatchSource:0}: Error finding container d9a021c1aab73b60b9acb571b4b55cf6f72d71585e4f5346c9d47df787096d1e: Status 404 returned error can't find the container with id d9a021c1aab73b60b9acb571b4b55cf6f72d71585e4f5346c9d47df787096d1e
Apr 17 21:24:33.257514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.257378 2572 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/31f54739-a583-46c6-9052-0b86c21f2f16-tls-cert\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\""
Apr 17 21:24:33.257514 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:33.257395 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p8qsf\" (UniqueName: \"kubernetes.io/projected/558d628d-438f-4c53-abc6-e7938c252aa4-kube-api-access-p8qsf\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\""
Apr 17 21:24:34.028205 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.028156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7f57fb5654-f4ngj" event={"ID":"ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1","Type":"ContainerStarted","Data":"b407ff5c4ee7545e945e877932db994efd9cbee0d87f43ea13e19f0a35c01544"}
Apr 17 21:24:34.028205 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.028207 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7f57fb5654-f4ngj" event={"ID":"ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1","Type":"ContainerStarted","Data":"d9a021c1aab73b60b9acb571b4b55cf6f72d71585e4f5346c9d47df787096d1e"}
Apr 17 21:24:34.028205 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.028214 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-q9lhq"
Apr 17 21:24:34.028520 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.028375 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-5spm2"
Apr 17 21:24:34.045781 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.045727 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7f57fb5654-f4ngj" podStartSLOduration=1.72814086 podStartE2EDuration="2.045710085s" podCreationTimestamp="2026-04-17 21:24:32 +0000 UTC" firstStartedPulling="2026-04-17 21:24:33.258667141 +0000 UTC m=+676.955222190" lastFinishedPulling="2026-04-17 21:24:33.576236361 +0000 UTC m=+677.272791415" observedRunningTime="2026-04-17 21:24:34.045027489 +0000 UTC m=+677.741582562" watchObservedRunningTime="2026-04-17 21:24:34.045710085 +0000 UTC m=+677.742265158"
Apr 17 21:24:34.065145 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.065108 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-q9lhq"]
Apr 17 21:24:34.072372 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.072297 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-z2d7j"]
Apr 17 21:24:34.072607 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.072571 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-z2d7j" podUID="cb575c6a-4f1b-48cd-b764-eab8ce593c57" containerName="authorino" containerID="cri-o://32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255" gracePeriod=30
Apr 17 21:24:34.074072 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.074050 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-q9lhq"]
Apr 17 21:24:34.091816 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.091783 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-5spm2"]
Apr 17 21:24:34.098770 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.098739 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-5spm2"]
Apr 17 21:24:34.319886 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.319858 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-z2d7j"
Apr 17 21:24:34.466312 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.466275 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c9fc\" (UniqueName: \"kubernetes.io/projected/cb575c6a-4f1b-48cd-b764-eab8ce593c57-kube-api-access-6c9fc\") pod \"cb575c6a-4f1b-48cd-b764-eab8ce593c57\" (UID: \"cb575c6a-4f1b-48cd-b764-eab8ce593c57\") "
Apr 17 21:24:34.468407 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.468382 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb575c6a-4f1b-48cd-b764-eab8ce593c57-kube-api-access-6c9fc" (OuterVolumeSpecName: "kube-api-access-6c9fc") pod "cb575c6a-4f1b-48cd-b764-eab8ce593c57" (UID: "cb575c6a-4f1b-48cd-b764-eab8ce593c57"). InnerVolumeSpecName "kube-api-access-6c9fc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:24:34.567240 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.567119 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6c9fc\" (UniqueName: \"kubernetes.io/projected/cb575c6a-4f1b-48cd-b764-eab8ce593c57-kube-api-access-6c9fc\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\""
Apr 17 21:24:34.868862 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.868784 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31f54739-a583-46c6-9052-0b86c21f2f16" path="/var/lib/kubelet/pods/31f54739-a583-46c6-9052-0b86c21f2f16/volumes"
Apr 17 21:24:34.869017 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.869004 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="558d628d-438f-4c53-abc6-e7938c252aa4" path="/var/lib/kubelet/pods/558d628d-438f-4c53-abc6-e7938c252aa4/volumes"
Apr 17 21:24:34.978608 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.978570 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7fd5dbf468-jw92l"]
Apr 17 21:24:34.978969 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.978956 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb575c6a-4f1b-48cd-b764-eab8ce593c57" containerName="authorino"
Apr 17 21:24:34.979018 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.978972 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb575c6a-4f1b-48cd-b764-eab8ce593c57" containerName="authorino"
Apr 17 21:24:34.979052 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.979031 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb575c6a-4f1b-48cd-b764-eab8ce593c57" containerName="authorino"
Apr 17 21:24:34.983197 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.983159 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7fd5dbf468-jw92l"
Apr 17 21:24:34.985390 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.985364 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-mv65r\""
Apr 17 21:24:34.988546 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:34.988519 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7fd5dbf468-jw92l"]
Apr 17 21:24:35.033192 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.033130 2572 generic.go:358] "Generic (PLEG): container finished" podID="cb575c6a-4f1b-48cd-b764-eab8ce593c57" containerID="32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255" exitCode=0
Apr 17 21:24:35.033389 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.033209 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-z2d7j" event={"ID":"cb575c6a-4f1b-48cd-b764-eab8ce593c57","Type":"ContainerDied","Data":"32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255"}
Apr 17 21:24:35.033389 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.033249 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-z2d7j" event={"ID":"cb575c6a-4f1b-48cd-b764-eab8ce593c57","Type":"ContainerDied","Data":"4133f3d88d652c21eca4d6231e1d2362f7cfb863c25c87632dd3d52878f78ad5"}
Apr 17 21:24:35.033389 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.033270 2572 scope.go:117] "RemoveContainer" containerID="32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255"
Apr 17 21:24:35.033389 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.033222 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-z2d7j"
Apr 17 21:24:35.041967 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.041947 2572 scope.go:117] "RemoveContainer" containerID="32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255"
Apr 17 21:24:35.042284 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:24:35.042265 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255\": container with ID starting with 32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255 not found: ID does not exist" containerID="32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255"
Apr 17 21:24:35.042342 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.042293 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255"} err="failed to get container status \"32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255\": rpc error: code = NotFound desc = could not find container \"32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255\": container with ID starting with 32210fa2cec3d977e2877dca7b855f38fed386c7e438c8fa088ca9759219e255 not found: ID does not exist"
Apr 17 21:24:35.047459 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.047421 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-z2d7j"]
Apr 17 21:24:35.048955 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.048926 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-z2d7j"]
Apr 17 21:24:35.070719 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.070682 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f7rp\" (UniqueName: \"kubernetes.io/projected/a7198ddb-6de6-4646-a404-69d3c24dba71-kube-api-access-9f7rp\") pod \"maas-controller-7fd5dbf468-jw92l\" (UID: \"a7198ddb-6de6-4646-a404-69d3c24dba71\") " pod="opendatahub/maas-controller-7fd5dbf468-jw92l"
Apr 17 21:24:35.116144 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.116105 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-756d6c5c7d-6t8zl"]
Apr 17 21:24:35.120734 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.120667 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-756d6c5c7d-6t8zl"
Apr 17 21:24:35.127232 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.127201 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-756d6c5c7d-6t8zl"]
Apr 17 21:24:35.172002 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.171964 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9f7rp\" (UniqueName: \"kubernetes.io/projected/a7198ddb-6de6-4646-a404-69d3c24dba71-kube-api-access-9f7rp\") pod \"maas-controller-7fd5dbf468-jw92l\" (UID: \"a7198ddb-6de6-4646-a404-69d3c24dba71\") " pod="opendatahub/maas-controller-7fd5dbf468-jw92l"
Apr 17 21:24:35.179451 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.179417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f7rp\" (UniqueName: \"kubernetes.io/projected/a7198ddb-6de6-4646-a404-69d3c24dba71-kube-api-access-9f7rp\") pod \"maas-controller-7fd5dbf468-jw92l\" (UID: \"a7198ddb-6de6-4646-a404-69d3c24dba71\") " pod="opendatahub/maas-controller-7fd5dbf468-jw92l"
Apr 17 21:24:35.273357 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.273318 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d477d\" (UniqueName: \"kubernetes.io/projected/79d12426-544d-4aca-9d61-2bb15902f197-kube-api-access-d477d\") pod \"maas-controller-756d6c5c7d-6t8zl\" (UID: \"79d12426-544d-4aca-9d61-2bb15902f197\") " pod="opendatahub/maas-controller-756d6c5c7d-6t8zl"
Apr 17 21:24:35.294159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.294123 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7fd5dbf468-jw92l"
Apr 17 21:24:35.374104 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.374021 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d477d\" (UniqueName: \"kubernetes.io/projected/79d12426-544d-4aca-9d61-2bb15902f197-kube-api-access-d477d\") pod \"maas-controller-756d6c5c7d-6t8zl\" (UID: \"79d12426-544d-4aca-9d61-2bb15902f197\") " pod="opendatahub/maas-controller-756d6c5c7d-6t8zl"
Apr 17 21:24:35.382686 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.382653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d477d\" (UniqueName: \"kubernetes.io/projected/79d12426-544d-4aca-9d61-2bb15902f197-kube-api-access-d477d\") pod \"maas-controller-756d6c5c7d-6t8zl\" (UID: \"79d12426-544d-4aca-9d61-2bb15902f197\") " pod="opendatahub/maas-controller-756d6c5c7d-6t8zl"
Apr 17 21:24:35.423572 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.423533 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7fd5dbf468-jw92l"]
Apr 17 21:24:35.427926 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:24:35.427891 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7198ddb_6de6_4646_a404_69d3c24dba71.slice/crio-90c7f1eeac0f468cb33b26f83a0d941deb21017dd5f2019c805b121eb41b655c WatchSource:0}: Error finding container 90c7f1eeac0f468cb33b26f83a0d941deb21017dd5f2019c805b121eb41b655c: Status 404 returned error can't find the container with id 90c7f1eeac0f468cb33b26f83a0d941deb21017dd5f2019c805b121eb41b655c
Apr 17 21:24:35.431993 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.431970 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-756d6c5c7d-6t8zl"
Apr 17 21:24:35.555540 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:35.555516 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-756d6c5c7d-6t8zl"]
Apr 17 21:24:35.557596 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:24:35.557562 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79d12426_544d_4aca_9d61_2bb15902f197.slice/crio-f8f83826134b71ae77d495e6661ffa575147080ca2ccf4b44e237395ae9d3430 WatchSource:0}: Error finding container f8f83826134b71ae77d495e6661ffa575147080ca2ccf4b44e237395ae9d3430: Status 404 returned error can't find the container with id f8f83826134b71ae77d495e6661ffa575147080ca2ccf4b44e237395ae9d3430
Apr 17 21:24:36.038293 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:36.038251 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-756d6c5c7d-6t8zl" event={"ID":"79d12426-544d-4aca-9d61-2bb15902f197","Type":"ContainerStarted","Data":"f8f83826134b71ae77d495e6661ffa575147080ca2ccf4b44e237395ae9d3430"}
Apr 17 21:24:36.039799 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:36.039766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7fd5dbf468-jw92l" event={"ID":"a7198ddb-6de6-4646-a404-69d3c24dba71","Type":"ContainerStarted","Data":"90c7f1eeac0f468cb33b26f83a0d941deb21017dd5f2019c805b121eb41b655c"}
Apr 17 21:24:36.871341 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:36.871304 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb575c6a-4f1b-48cd-b764-eab8ce593c57" path="/var/lib/kubelet/pods/cb575c6a-4f1b-48cd-b764-eab8ce593c57/volumes"
Apr 17 21:24:39.053481 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:39.053442 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-756d6c5c7d-6t8zl" event={"ID":"79d12426-544d-4aca-9d61-2bb15902f197","Type":"ContainerStarted","Data":"812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89"}
Apr 17 21:24:39.053996 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:39.053543 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-756d6c5c7d-6t8zl"
Apr 17 21:24:39.054855 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:39.054835 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7fd5dbf468-jw92l" event={"ID":"a7198ddb-6de6-4646-a404-69d3c24dba71","Type":"ContainerStarted","Data":"427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69"}
Apr 17 21:24:39.054943 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:39.054865 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7fd5dbf468-jw92l"
Apr 17 21:24:39.069330 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:39.069283 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-756d6c5c7d-6t8zl" podStartSLOduration=0.957843828 podStartE2EDuration="4.069269063s" podCreationTimestamp="2026-04-17 21:24:35 +0000 UTC" firstStartedPulling="2026-04-17 21:24:35.5589408 +0000 UTC m=+679.255495850" lastFinishedPulling="2026-04-17 21:24:38.670366035 +0000 UTC m=+682.366921085" observedRunningTime="2026-04-17 21:24:39.067412917 +0000 UTC m=+682.763967988" watchObservedRunningTime="2026-04-17 21:24:39.069269063 +0000 UTC m=+682.765824134"
Apr 17 21:24:39.080386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:39.080327 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7fd5dbf468-jw92l" podStartSLOduration=1.847042839 podStartE2EDuration="5.080306868s" podCreationTimestamp="2026-04-17 21:24:34 +0000 UTC" firstStartedPulling="2026-04-17 21:24:35.429460047 +0000 UTC m=+679.126015104" lastFinishedPulling="2026-04-17 21:24:38.662724077 +0000 UTC m=+682.359279133" observedRunningTime="2026-04-17 21:24:39.079867194 +0000 UTC m=+682.776422263" watchObservedRunningTime="2026-04-17 21:24:39.080306868 +0000 UTC m=+682.776861939"
Apr 17 21:24:50.063334 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.063299 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-756d6c5c7d-6t8zl"
Apr 17 21:24:50.063798 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.063708 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7fd5dbf468-jw92l"
Apr 17 21:24:50.112639 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.112593 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7fd5dbf468-jw92l"]
Apr 17 21:24:50.112898 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.112855 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7fd5dbf468-jw92l" podUID="a7198ddb-6de6-4646-a404-69d3c24dba71" containerName="manager" containerID="cri-o://427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69" gracePeriod=10
Apr 17 21:24:50.357139 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.357107 2572 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="opendatahub/maas-controller-7fd5dbf468-jw92l" Apr 17 21:24:50.383333 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.383299 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-764cf56b49-dz8mz"] Apr 17 21:24:50.383658 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.383647 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7198ddb-6de6-4646-a404-69d3c24dba71" containerName="manager" Apr 17 21:24:50.383702 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.383659 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7198ddb-6de6-4646-a404-69d3c24dba71" containerName="manager" Apr 17 21:24:50.383737 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.383724 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7198ddb-6de6-4646-a404-69d3c24dba71" containerName="manager" Apr 17 21:24:50.386868 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.386844 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-764cf56b49-dz8mz" Apr 17 21:24:50.394435 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.394393 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-764cf56b49-dz8mz"] Apr 17 21:24:50.416645 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.416609 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f7rp\" (UniqueName: \"kubernetes.io/projected/a7198ddb-6de6-4646-a404-69d3c24dba71-kube-api-access-9f7rp\") pod \"a7198ddb-6de6-4646-a404-69d3c24dba71\" (UID: \"a7198ddb-6de6-4646-a404-69d3c24dba71\") " Apr 17 21:24:50.416830 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.416740 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfnp4\" (UniqueName: \"kubernetes.io/projected/32031ea5-5706-4f88-aeb9-c78a19f458da-kube-api-access-hfnp4\") pod \"maas-controller-764cf56b49-dz8mz\" (UID: \"32031ea5-5706-4f88-aeb9-c78a19f458da\") " pod="opendatahub/maas-controller-764cf56b49-dz8mz" Apr 17 21:24:50.418807 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.418768 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7198ddb-6de6-4646-a404-69d3c24dba71-kube-api-access-9f7rp" (OuterVolumeSpecName: "kube-api-access-9f7rp") pod "a7198ddb-6de6-4646-a404-69d3c24dba71" (UID: "a7198ddb-6de6-4646-a404-69d3c24dba71"). InnerVolumeSpecName "kube-api-access-9f7rp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:24:50.517527 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.517487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfnp4\" (UniqueName: \"kubernetes.io/projected/32031ea5-5706-4f88-aeb9-c78a19f458da-kube-api-access-hfnp4\") pod \"maas-controller-764cf56b49-dz8mz\" (UID: \"32031ea5-5706-4f88-aeb9-c78a19f458da\") " pod="opendatahub/maas-controller-764cf56b49-dz8mz" Apr 17 21:24:50.517701 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.517542 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9f7rp\" (UniqueName: \"kubernetes.io/projected/a7198ddb-6de6-4646-a404-69d3c24dba71-kube-api-access-9f7rp\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:24:50.524998 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.524971 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfnp4\" (UniqueName: \"kubernetes.io/projected/32031ea5-5706-4f88-aeb9-c78a19f458da-kube-api-access-hfnp4\") pod \"maas-controller-764cf56b49-dz8mz\" (UID: \"32031ea5-5706-4f88-aeb9-c78a19f458da\") " pod="opendatahub/maas-controller-764cf56b49-dz8mz" Apr 17 21:24:50.698293 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.698207 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-764cf56b49-dz8mz" Apr 17 21:24:50.820508 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:50.820429 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-764cf56b49-dz8mz"] Apr 17 21:24:50.823229 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:24:50.823200 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32031ea5_5706_4f88_aeb9_c78a19f458da.slice/crio-0fa5cc9f88ccea6cb52fc2b26959f96de727f10261afd2809e98749bcf7c6719 WatchSource:0}: Error finding container 0fa5cc9f88ccea6cb52fc2b26959f96de727f10261afd2809e98749bcf7c6719: Status 404 returned error can't find the container with id 0fa5cc9f88ccea6cb52fc2b26959f96de727f10261afd2809e98749bcf7c6719 Apr 17 21:24:51.097750 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:51.097724 2572 generic.go:358] "Generic (PLEG): container finished" podID="a7198ddb-6de6-4646-a404-69d3c24dba71" containerID="427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69" exitCode=0 Apr 17 21:24:51.098103 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:51.097814 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7fd5dbf468-jw92l" event={"ID":"a7198ddb-6de6-4646-a404-69d3c24dba71","Type":"ContainerDied","Data":"427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69"} Apr 17 21:24:51.098103 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:51.097839 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7fd5dbf468-jw92l" event={"ID":"a7198ddb-6de6-4646-a404-69d3c24dba71","Type":"ContainerDied","Data":"90c7f1eeac0f468cb33b26f83a0d941deb21017dd5f2019c805b121eb41b655c"} Apr 17 21:24:51.098103 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:51.097840 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7fd5dbf468-jw92l" Apr 17 21:24:51.098103 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:51.097853 2572 scope.go:117] "RemoveContainer" containerID="427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69" Apr 17 21:24:51.098916 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:51.098891 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-764cf56b49-dz8mz" event={"ID":"32031ea5-5706-4f88-aeb9-c78a19f458da","Type":"ContainerStarted","Data":"0fa5cc9f88ccea6cb52fc2b26959f96de727f10261afd2809e98749bcf7c6719"} Apr 17 21:24:51.107182 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:51.107081 2572 scope.go:117] "RemoveContainer" containerID="427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69" Apr 17 21:24:51.107424 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:24:51.107402 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69\": container with ID starting with 427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69 not found: ID does not exist" containerID="427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69" Apr 17 21:24:51.107501 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:51.107431 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69"} err="failed to get container status \"427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69\": rpc error: code = NotFound desc = could not find container \"427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69\": container with ID starting with 427ecd1407cce93c0f6280330649d15919f2e4c75f820555b9fb0dfeeabe2e69 not found: ID does not exist" Apr 17 21:24:51.114419 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:51.114389 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7fd5dbf468-jw92l"] Apr 17 21:24:51.118350 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:51.118325 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7fd5dbf468-jw92l"] Apr 17 21:24:52.103847 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:52.103818 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-764cf56b49-dz8mz" event={"ID":"32031ea5-5706-4f88-aeb9-c78a19f458da","Type":"ContainerStarted","Data":"63b99d2ec30fc210143fd7175a3762fed6186304879bc8aa35746888ac9fa0c7"} Apr 17 21:24:52.104225 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:52.103918 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-764cf56b49-dz8mz" Apr 17 21:24:52.120758 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:52.120719 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-764cf56b49-dz8mz" podStartSLOduration=1.84975083 podStartE2EDuration="2.120706033s" podCreationTimestamp="2026-04-17 21:24:50 +0000 UTC" firstStartedPulling="2026-04-17 21:24:50.824570206 +0000 UTC m=+694.521125259" lastFinishedPulling="2026-04-17 21:24:51.095525388 +0000 UTC m=+694.792080462" observedRunningTime="2026-04-17 21:24:52.118518114 +0000 UTC m=+695.815073185" watchObservedRunningTime="2026-04-17 21:24:52.120706033 +0000 UTC m=+695.817261105" Apr 17 21:24:52.868128 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:52.868092 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7198ddb-6de6-4646-a404-69d3c24dba71" path="/var/lib/kubelet/pods/a7198ddb-6de6-4646-a404-69d3c24dba71/volumes" Apr 17 21:24:58.960927 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:58.960889 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-5b4449676-fmk49"] Apr 17 21:24:58.964586 ip-10-0-135-174 kubenswrapper[2572]: 
I0417 21:24:58.964568 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5b4449676-fmk49" Apr 17 21:24:58.966765 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:58.966740 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 21:24:58.966871 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:58.966746 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-c56s7\"" Apr 17 21:24:58.966871 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:58.966749 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 21:24:58.974620 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:58.974586 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5b4449676-fmk49"] Apr 17 21:24:59.097753 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:59.097717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22rfj\" (UniqueName: \"kubernetes.io/projected/7246fee8-7040-4e73-a863-f27ac5776673-kube-api-access-22rfj\") pod \"maas-api-5b4449676-fmk49\" (UID: \"7246fee8-7040-4e73-a863-f27ac5776673\") " pod="opendatahub/maas-api-5b4449676-fmk49" Apr 17 21:24:59.097918 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:59.097777 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7246fee8-7040-4e73-a863-f27ac5776673-maas-api-tls\") pod \"maas-api-5b4449676-fmk49\" (UID: \"7246fee8-7040-4e73-a863-f27ac5776673\") " pod="opendatahub/maas-api-5b4449676-fmk49" Apr 17 21:24:59.198582 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:59.198547 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22rfj\" (UniqueName: 
\"kubernetes.io/projected/7246fee8-7040-4e73-a863-f27ac5776673-kube-api-access-22rfj\") pod \"maas-api-5b4449676-fmk49\" (UID: \"7246fee8-7040-4e73-a863-f27ac5776673\") " pod="opendatahub/maas-api-5b4449676-fmk49" Apr 17 21:24:59.198768 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:59.198600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7246fee8-7040-4e73-a863-f27ac5776673-maas-api-tls\") pod \"maas-api-5b4449676-fmk49\" (UID: \"7246fee8-7040-4e73-a863-f27ac5776673\") " pod="opendatahub/maas-api-5b4449676-fmk49" Apr 17 21:24:59.201229 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:59.201198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7246fee8-7040-4e73-a863-f27ac5776673-maas-api-tls\") pod \"maas-api-5b4449676-fmk49\" (UID: \"7246fee8-7040-4e73-a863-f27ac5776673\") " pod="opendatahub/maas-api-5b4449676-fmk49" Apr 17 21:24:59.208229 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:59.208205 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22rfj\" (UniqueName: \"kubernetes.io/projected/7246fee8-7040-4e73-a863-f27ac5776673-kube-api-access-22rfj\") pod \"maas-api-5b4449676-fmk49\" (UID: \"7246fee8-7040-4e73-a863-f27ac5776673\") " pod="opendatahub/maas-api-5b4449676-fmk49" Apr 17 21:24:59.277193 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:59.277131 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-5b4449676-fmk49" Apr 17 21:24:59.408059 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:24:59.408034 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5b4449676-fmk49"] Apr 17 21:24:59.410901 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:24:59.410871 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7246fee8_7040_4e73_a863_f27ac5776673.slice/crio-a4a46a3209218e9d2b788ce230fe5259473467492da3d085d8cfe9a61427c80c WatchSource:0}: Error finding container a4a46a3209218e9d2b788ce230fe5259473467492da3d085d8cfe9a61427c80c: Status 404 returned error can't find the container with id a4a46a3209218e9d2b788ce230fe5259473467492da3d085d8cfe9a61427c80c Apr 17 21:25:00.135208 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:00.135149 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5b4449676-fmk49" event={"ID":"7246fee8-7040-4e73-a863-f27ac5776673","Type":"ContainerStarted","Data":"a4a46a3209218e9d2b788ce230fe5259473467492da3d085d8cfe9a61427c80c"} Apr 17 21:25:01.140831 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:01.140795 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5b4449676-fmk49" event={"ID":"7246fee8-7040-4e73-a863-f27ac5776673","Type":"ContainerStarted","Data":"2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574"} Apr 17 21:25:01.141262 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:01.140885 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-5b4449676-fmk49" Apr 17 21:25:01.159745 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:01.159634 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-5b4449676-fmk49" podStartSLOduration=1.6657719279999998 podStartE2EDuration="3.159615987s" podCreationTimestamp="2026-04-17 21:24:58 +0000 UTC" 
firstStartedPulling="2026-04-17 21:24:59.412156182 +0000 UTC m=+703.108711232" lastFinishedPulling="2026-04-17 21:25:00.906000242 +0000 UTC m=+704.602555291" observedRunningTime="2026-04-17 21:25:01.158417147 +0000 UTC m=+704.854972219" watchObservedRunningTime="2026-04-17 21:25:01.159615987 +0000 UTC m=+704.856171061" Apr 17 21:25:03.113241 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:03.113209 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-764cf56b49-dz8mz" Apr 17 21:25:03.152951 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:03.152907 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-756d6c5c7d-6t8zl"] Apr 17 21:25:03.153280 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:03.153152 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-756d6c5c7d-6t8zl" podUID="79d12426-544d-4aca-9d61-2bb15902f197" containerName="manager" containerID="cri-o://812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89" gracePeriod=10 Apr 17 21:25:03.401493 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:03.401466 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-756d6c5c7d-6t8zl" Apr 17 21:25:03.541502 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:03.541468 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d477d\" (UniqueName: \"kubernetes.io/projected/79d12426-544d-4aca-9d61-2bb15902f197-kube-api-access-d477d\") pod \"79d12426-544d-4aca-9d61-2bb15902f197\" (UID: \"79d12426-544d-4aca-9d61-2bb15902f197\") " Apr 17 21:25:03.543580 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:03.543546 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79d12426-544d-4aca-9d61-2bb15902f197-kube-api-access-d477d" (OuterVolumeSpecName: "kube-api-access-d477d") pod "79d12426-544d-4aca-9d61-2bb15902f197" (UID: "79d12426-544d-4aca-9d61-2bb15902f197"). InnerVolumeSpecName "kube-api-access-d477d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:25:03.642234 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:03.642119 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d477d\" (UniqueName: \"kubernetes.io/projected/79d12426-544d-4aca-9d61-2bb15902f197-kube-api-access-d477d\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\"" Apr 17 21:25:04.151722 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:04.151684 2572 generic.go:358] "Generic (PLEG): container finished" podID="79d12426-544d-4aca-9d61-2bb15902f197" containerID="812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89" exitCode=0 Apr 17 21:25:04.152159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:04.151744 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-756d6c5c7d-6t8zl" Apr 17 21:25:04.152159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:04.151771 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-756d6c5c7d-6t8zl" event={"ID":"79d12426-544d-4aca-9d61-2bb15902f197","Type":"ContainerDied","Data":"812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89"} Apr 17 21:25:04.152159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:04.151810 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-756d6c5c7d-6t8zl" event={"ID":"79d12426-544d-4aca-9d61-2bb15902f197","Type":"ContainerDied","Data":"f8f83826134b71ae77d495e6661ffa575147080ca2ccf4b44e237395ae9d3430"} Apr 17 21:25:04.152159 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:04.151826 2572 scope.go:117] "RemoveContainer" containerID="812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89" Apr 17 21:25:04.160809 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:04.160647 2572 scope.go:117] "RemoveContainer" containerID="812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89" Apr 17 21:25:04.160984 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:25:04.160964 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89\": container with ID starting with 812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89 not found: ID does not exist" containerID="812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89" Apr 17 21:25:04.161032 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:04.160995 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89"} err="failed to get container status \"812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89\": rpc error: 
code = NotFound desc = could not find container \"812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89\": container with ID starting with 812c1aa37546cc31c95ef45c1c72e59b95abbda3d801b75c02576d75f8352b89 not found: ID does not exist" Apr 17 21:25:04.172310 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:04.172267 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-756d6c5c7d-6t8zl"] Apr 17 21:25:04.173733 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:04.173711 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-756d6c5c7d-6t8zl"] Apr 17 21:25:04.867933 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:04.867899 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79d12426-544d-4aca-9d61-2bb15902f197" path="/var/lib/kubelet/pods/79d12426-544d-4aca-9d61-2bb15902f197/volumes" Apr 17 21:25:07.150346 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:07.150268 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-5b4449676-fmk49" Apr 17 21:25:14.162248 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.162208 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"] Apr 17 21:25:14.162611 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.162578 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79d12426-544d-4aca-9d61-2bb15902f197" containerName="manager" Apr 17 21:25:14.162611 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.162589 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d12426-544d-4aca-9d61-2bb15902f197" containerName="manager" Apr 17 21:25:14.162695 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.162643 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="79d12426-544d-4aca-9d61-2bb15902f197" containerName="manager" Apr 17 21:25:14.169842 ip-10-0-135-174 kubenswrapper[2572]: I0417 
21:25:14.169810 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" Apr 17 21:25:14.171831 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.171801 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-vmdkh\"" Apr 17 21:25:14.172247 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.172226 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 17 21:25:14.172655 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.172638 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 21:25:14.172728 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.172679 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 21:25:14.172793 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.172727 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"] Apr 17 21:25:14.337594 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.337553 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4329648-06d9-49f2-bbc6-c9cb28c1e100-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" Apr 17 21:25:14.337798 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.337659 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4329648-06d9-49f2-bbc6-c9cb28c1e100-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " 
pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" Apr 17 21:25:14.337798 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.337698 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4329648-06d9-49f2-bbc6-c9cb28c1e100-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" Apr 17 21:25:14.337925 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.337812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42wzv\" (UniqueName: \"kubernetes.io/projected/c4329648-06d9-49f2-bbc6-c9cb28c1e100-kube-api-access-42wzv\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" Apr 17 21:25:14.337925 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.337864 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4329648-06d9-49f2-bbc6-c9cb28c1e100-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" Apr 17 21:25:14.337925 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.337900 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4329648-06d9-49f2-bbc6-c9cb28c1e100-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" Apr 17 21:25:14.438955 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.438849 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4329648-06d9-49f2-bbc6-c9cb28c1e100-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.438955 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.438899 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4329648-06d9-49f2-bbc6-c9cb28c1e100-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.438955 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.438944 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42wzv\" (UniqueName: \"kubernetes.io/projected/c4329648-06d9-49f2-bbc6-c9cb28c1e100-kube-api-access-42wzv\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.439276 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.438961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4329648-06d9-49f2-bbc6-c9cb28c1e100-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.439276 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.438979 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4329648-06d9-49f2-bbc6-c9cb28c1e100-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.439276 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.439116 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4329648-06d9-49f2-bbc6-c9cb28c1e100-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.439441 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.439387 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4329648-06d9-49f2-bbc6-c9cb28c1e100-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.439441 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.439403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4329648-06d9-49f2-bbc6-c9cb28c1e100-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.439510 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.439493 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4329648-06d9-49f2-bbc6-c9cb28c1e100-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.441323 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.441297 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4329648-06d9-49f2-bbc6-c9cb28c1e100-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.441502 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.441486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4329648-06d9-49f2-bbc6-c9cb28c1e100-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.447147 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.447119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42wzv\" (UniqueName: \"kubernetes.io/projected/c4329648-06d9-49f2-bbc6-c9cb28c1e100-kube-api-access-42wzv\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-zm6l7\" (UID: \"c4329648-06d9-49f2-bbc6-c9cb28c1e100\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.481359 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.481314 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:14.609527 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:14.609490 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"]
Apr 17 21:25:14.612587 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:25:14.612547 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4329648_06d9_49f2_bbc6_c9cb28c1e100.slice/crio-be480a1b87e83c570574bce6c8afa970f109620a74aa17ef4fe8c6509659db07 WatchSource:0}: Error finding container be480a1b87e83c570574bce6c8afa970f109620a74aa17ef4fe8c6509659db07: Status 404 returned error can't find the container with id be480a1b87e83c570574bce6c8afa970f109620a74aa17ef4fe8c6509659db07
Apr 17 21:25:15.193900 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:15.193862 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" event={"ID":"c4329648-06d9-49f2-bbc6-c9cb28c1e100","Type":"ContainerStarted","Data":"be480a1b87e83c570574bce6c8afa970f109620a74aa17ef4fe8c6509659db07"}
Apr 17 21:25:21.218188 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:21.218140 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" event={"ID":"c4329648-06d9-49f2-bbc6-c9cb28c1e100","Type":"ContainerStarted","Data":"e88f8d663b6afde17ae344d9203cbb2282b02abe79a36bbc542a3f88755ffcff"}
Apr 17 21:25:23.258331 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.258294 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"]
Apr 17 21:25:23.262134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.262108 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.264360 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.264331 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 17 21:25:23.268868 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.268838 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"]
Apr 17 21:25:23.423426 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.423394 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e38f40b4-53d8-4792-8ced-d1409571e740-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.423615 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.423441 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v9l4\" (UniqueName: \"kubernetes.io/projected/e38f40b4-53d8-4792-8ced-d1409571e740-kube-api-access-7v9l4\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.423615 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.423472 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e38f40b4-53d8-4792-8ced-d1409571e740-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.423615 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.423492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e38f40b4-53d8-4792-8ced-d1409571e740-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.423615 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.423526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e38f40b4-53d8-4792-8ced-d1409571e740-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.423789 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.423635 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e38f40b4-53d8-4792-8ced-d1409571e740-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.524825 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.524722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e38f40b4-53d8-4792-8ced-d1409571e740-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.524825 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.524791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e38f40b4-53d8-4792-8ced-d1409571e740-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.524825 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.524825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e38f40b4-53d8-4792-8ced-d1409571e740-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.525105 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.524869 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7v9l4\" (UniqueName: \"kubernetes.io/projected/e38f40b4-53d8-4792-8ced-d1409571e740-kube-api-access-7v9l4\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.525105 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.524911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e38f40b4-53d8-4792-8ced-d1409571e740-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.525105 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.524943 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e38f40b4-53d8-4792-8ced-d1409571e740-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.525431 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.525405 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e38f40b4-53d8-4792-8ced-d1409571e740-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.525562 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.525531 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e38f40b4-53d8-4792-8ced-d1409571e740-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.525562 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.525561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e38f40b4-53d8-4792-8ced-d1409571e740-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.527522 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.527495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e38f40b4-53d8-4792-8ced-d1409571e740-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.527704 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.527685 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e38f40b4-53d8-4792-8ced-d1409571e740-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.534125 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.534077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v9l4\" (UniqueName: \"kubernetes.io/projected/e38f40b4-53d8-4792-8ced-d1409571e740-kube-api-access-7v9l4\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6\" (UID: \"e38f40b4-53d8-4792-8ced-d1409571e740\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.574850 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.574807 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:23.729209 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:23.729154 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"]
Apr 17 21:25:23.732983 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:25:23.732949 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode38f40b4_53d8_4792_8ced_d1409571e740.slice/crio-ee6b270d5e4e89fe61bd2f208ee8b84ba432d3c57f0f61e4465d46e7858d0c0a WatchSource:0}: Error finding container ee6b270d5e4e89fe61bd2f208ee8b84ba432d3c57f0f61e4465d46e7858d0c0a: Status 404 returned error can't find the container with id ee6b270d5e4e89fe61bd2f208ee8b84ba432d3c57f0f61e4465d46e7858d0c0a
Apr 17 21:25:24.230250 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:24.230213 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" event={"ID":"e38f40b4-53d8-4792-8ced-d1409571e740","Type":"ContainerStarted","Data":"fe0003e65ea56138c695c597726b34bb9bc5a587d4dff56da1567418aa42f40f"}
Apr 17 21:25:24.230250 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:24.230256 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" event={"ID":"e38f40b4-53d8-4792-8ced-d1409571e740","Type":"ContainerStarted","Data":"ee6b270d5e4e89fe61bd2f208ee8b84ba432d3c57f0f61e4465d46e7858d0c0a"}
Apr 17 21:25:27.242864 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:27.242830 2572 generic.go:358] "Generic (PLEG): container finished" podID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" containerID="e88f8d663b6afde17ae344d9203cbb2282b02abe79a36bbc542a3f88755ffcff" exitCode=0
Apr 17 21:25:27.242864 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:27.242868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" event={"ID":"c4329648-06d9-49f2-bbc6-c9cb28c1e100","Type":"ContainerDied","Data":"e88f8d663b6afde17ae344d9203cbb2282b02abe79a36bbc542a3f88755ffcff"}
Apr 17 21:25:29.251003 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:29.250974 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/0.log"
Apr 17 21:25:29.251444 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:29.251275 2572 generic.go:358] "Generic (PLEG): container finished" podID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" containerID="b639fd2755816ef9a4f44cc9f5d53e58947eca9b2ef093d4970faa147cc847b4" exitCode=2
Apr 17 21:25:29.251444 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:29.251343 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" event={"ID":"c4329648-06d9-49f2-bbc6-c9cb28c1e100","Type":"ContainerDied","Data":"b639fd2755816ef9a4f44cc9f5d53e58947eca9b2ef093d4970faa147cc847b4"}
Apr 17 21:25:29.251706 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:29.251693 2572 scope.go:117] "RemoveContainer" containerID="b639fd2755816ef9a4f44cc9f5d53e58947eca9b2ef093d4970faa147cc847b4"
Apr 17 21:25:30.257503 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:30.257468 2572 generic.go:358] "Generic (PLEG): container finished" podID="e38f40b4-53d8-4792-8ced-d1409571e740" containerID="fe0003e65ea56138c695c597726b34bb9bc5a587d4dff56da1567418aa42f40f" exitCode=0
Apr 17 21:25:30.257984 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:30.257545 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" event={"ID":"e38f40b4-53d8-4792-8ced-d1409571e740","Type":"ContainerDied","Data":"fe0003e65ea56138c695c597726b34bb9bc5a587d4dff56da1567418aa42f40f"}
Apr 17 21:25:30.258996 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:30.258982 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/1.log"
Apr 17 21:25:30.259375 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:30.259361 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/0.log"
Apr 17 21:25:30.259708 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:30.259667 2572 generic.go:358] "Generic (PLEG): container finished" podID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" containerID="f9a18cbba12d7a87f0c45539ef27fba1164a55db184405558660771f4c816d9c" exitCode=2
Apr 17 21:25:30.259783 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:30.259719 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" event={"ID":"c4329648-06d9-49f2-bbc6-c9cb28c1e100","Type":"ContainerDied","Data":"f9a18cbba12d7a87f0c45539ef27fba1164a55db184405558660771f4c816d9c"}
Apr 17 21:25:30.259783 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:30.259758 2572 scope.go:117] "RemoveContainer" containerID="b639fd2755816ef9a4f44cc9f5d53e58947eca9b2ef093d4970faa147cc847b4"
Apr 17 21:25:30.260242 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:30.260225 2572 scope.go:117] "RemoveContainer" containerID="f9a18cbba12d7a87f0c45539ef27fba1164a55db184405558660771f4c816d9c"
Apr 17 21:25:30.260557 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:25:30.260534 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:25:31.263840 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:31.263811 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/1.log"
Apr 17 21:25:31.265510 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:31.265490 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/0.log"
Apr 17 21:25:31.265796 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:31.265777 2572 generic.go:358] "Generic (PLEG): container finished" podID="e38f40b4-53d8-4792-8ced-d1409571e740" containerID="9138e744c920e9be056bdcf57559af27ec212057ea90f2fbd8ad4787913e6177" exitCode=2
Apr 17 21:25:31.265842 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:31.265829 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" event={"ID":"e38f40b4-53d8-4792-8ced-d1409571e740","Type":"ContainerDied","Data":"9138e744c920e9be056bdcf57559af27ec212057ea90f2fbd8ad4787913e6177"}
Apr 17 21:25:31.266145 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:31.266133 2572 scope.go:117] "RemoveContainer" containerID="9138e744c920e9be056bdcf57559af27ec212057ea90f2fbd8ad4787913e6177"
Apr 17 21:25:31.593330 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:31.593305 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-5b4449676-fmk49"]
Apr 17 21:25:31.593557 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:31.593536 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-5b4449676-fmk49" podUID="7246fee8-7040-4e73-a863-f27ac5776673" containerName="maas-api" containerID="cri-o://2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574" gracePeriod=30
Apr 17 21:25:31.832635 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:31.832612 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5b4449676-fmk49"
Apr 17 21:25:31.902354 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:31.902318 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22rfj\" (UniqueName: \"kubernetes.io/projected/7246fee8-7040-4e73-a863-f27ac5776673-kube-api-access-22rfj\") pod \"7246fee8-7040-4e73-a863-f27ac5776673\" (UID: \"7246fee8-7040-4e73-a863-f27ac5776673\") "
Apr 17 21:25:31.902517 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:31.902383 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7246fee8-7040-4e73-a863-f27ac5776673-maas-api-tls\") pod \"7246fee8-7040-4e73-a863-f27ac5776673\" (UID: \"7246fee8-7040-4e73-a863-f27ac5776673\") "
Apr 17 21:25:31.904421 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:31.904397 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7246fee8-7040-4e73-a863-f27ac5776673-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "7246fee8-7040-4e73-a863-f27ac5776673" (UID: "7246fee8-7040-4e73-a863-f27ac5776673"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 21:25:31.904496 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:31.904430 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7246fee8-7040-4e73-a863-f27ac5776673-kube-api-access-22rfj" (OuterVolumeSpecName: "kube-api-access-22rfj") pod "7246fee8-7040-4e73-a863-f27ac5776673" (UID: "7246fee8-7040-4e73-a863-f27ac5776673"). InnerVolumeSpecName "kube-api-access-22rfj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:25:32.002988 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.002962 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-22rfj\" (UniqueName: \"kubernetes.io/projected/7246fee8-7040-4e73-a863-f27ac5776673-kube-api-access-22rfj\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\""
Apr 17 21:25:32.002988 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.002987 2572 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7246fee8-7040-4e73-a863-f27ac5776673-maas-api-tls\") on node \"ip-10-0-135-174.ec2.internal\" DevicePath \"\""
Apr 17 21:25:32.270639 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.270604 2572 generic.go:358] "Generic (PLEG): container finished" podID="7246fee8-7040-4e73-a863-f27ac5776673" containerID="2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574" exitCode=0
Apr 17 21:25:32.271141 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.270679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5b4449676-fmk49" event={"ID":"7246fee8-7040-4e73-a863-f27ac5776673","Type":"ContainerDied","Data":"2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574"}
Apr 17 21:25:32.271141 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.270690 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5b4449676-fmk49"
Apr 17 21:25:32.271141 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.270707 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5b4449676-fmk49" event={"ID":"7246fee8-7040-4e73-a863-f27ac5776673","Type":"ContainerDied","Data":"a4a46a3209218e9d2b788ce230fe5259473467492da3d085d8cfe9a61427c80c"}
Apr 17 21:25:32.271141 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.270727 2572 scope.go:117] "RemoveContainer" containerID="2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574"
Apr 17 21:25:32.272327 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.272299 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/1.log"
Apr 17 21:25:32.272671 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.272656 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/0.log"
Apr 17 21:25:32.273000 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.272973 2572 generic.go:358] "Generic (PLEG): container finished" podID="e38f40b4-53d8-4792-8ced-d1409571e740" containerID="c0d5724668fa62fedc06bb65f8a6ee7bda525e83e7a0837295e54f5e2e2625bc" exitCode=2
Apr 17 21:25:32.273097 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.273010 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" event={"ID":"e38f40b4-53d8-4792-8ced-d1409571e740","Type":"ContainerDied","Data":"c0d5724668fa62fedc06bb65f8a6ee7bda525e83e7a0837295e54f5e2e2625bc"}
Apr 17 21:25:32.273512 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.273491 2572 scope.go:117] "RemoveContainer" containerID="c0d5724668fa62fedc06bb65f8a6ee7bda525e83e7a0837295e54f5e2e2625bc"
Apr 17 21:25:32.273751 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:25:32.273730 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:25:32.279291 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.279275 2572 scope.go:117] "RemoveContainer" containerID="2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574"
Apr 17 21:25:32.279544 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:25:32.279523 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574\": container with ID starting with 2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574 not found: ID does not exist" containerID="2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574"
Apr 17 21:25:32.279587 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.279552 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574"} err="failed to get container status \"2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574\": rpc error: code = NotFound desc = could not find container \"2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574\": container with ID starting with 2307327cffe0e130ad1e91c5a4a51b82473b4315dfc9bf3ca4ce07cbb5655574 not found: ID does not exist"
Apr 17 21:25:32.279587 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.279569 2572 scope.go:117] "RemoveContainer" containerID="9138e744c920e9be056bdcf57559af27ec212057ea90f2fbd8ad4787913e6177"
Apr 17 21:25:32.301268 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.301248 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-5b4449676-fmk49"]
Apr 17 21:25:32.305183 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.305153 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-5b4449676-fmk49"]
Apr 17 21:25:32.869068 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:32.869034 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7246fee8-7040-4e73-a863-f27ac5776673" path="/var/lib/kubelet/pods/7246fee8-7040-4e73-a863-f27ac5776673/volumes"
Apr 17 21:25:33.278261 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:33.278235 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/1.log"
Apr 17 21:25:33.575308 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:33.575238 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:33.575308 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:33.575268 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:33.575631 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:33.575616 2572 scope.go:117] "RemoveContainer" containerID="c0d5724668fa62fedc06bb65f8a6ee7bda525e83e7a0837295e54f5e2e2625bc"
Apr 17 21:25:33.575800 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:25:33.575784 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:25:34.481777 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:34.481735 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:34.481777 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:34.481782 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:34.482331 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:34.482312 2572 scope.go:117] "RemoveContainer" containerID="f9a18cbba12d7a87f0c45539ef27fba1164a55db184405558660771f4c816d9c"
Apr 17 21:25:34.482546 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:25:34.482526 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:25:44.864398 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:44.864368 2572 scope.go:117] "RemoveContainer" containerID="f9a18cbba12d7a87f0c45539ef27fba1164a55db184405558660771f4c816d9c"
Apr 17 21:25:45.320009 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:45.319977 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/2.log"
Apr 17 21:25:45.320399 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:45.320383 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/1.log"
Apr 17 21:25:45.320693 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:45.320672 2572 generic.go:358] "Generic (PLEG): container finished" podID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" containerID="47009ece0d91fc0ffbd0d733570ff2e919ecdc3e1a09aa8d872797b424c9f9ba" exitCode=2
Apr 17 21:25:45.320751 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:45.320728 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" event={"ID":"c4329648-06d9-49f2-bbc6-c9cb28c1e100","Type":"ContainerDied","Data":"47009ece0d91fc0ffbd0d733570ff2e919ecdc3e1a09aa8d872797b424c9f9ba"}
Apr 17 21:25:45.320789 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:45.320770 2572 scope.go:117] "RemoveContainer" containerID="f9a18cbba12d7a87f0c45539ef27fba1164a55db184405558660771f4c816d9c"
Apr 17 21:25:45.321160 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:45.321140 2572 scope.go:117] "RemoveContainer" containerID="47009ece0d91fc0ffbd0d733570ff2e919ecdc3e1a09aa8d872797b424c9f9ba"
Apr 17 21:25:45.321423 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:25:45.321397 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:25:46.325801 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:46.325775 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/2.log"
Apr 17 21:25:47.863615 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:47.863587 2572 scope.go:117] "RemoveContainer" containerID="c0d5724668fa62fedc06bb65f8a6ee7bda525e83e7a0837295e54f5e2e2625bc"
Apr 17 21:25:48.334003 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:48.333977 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/2.log"
Apr 17 21:25:48.334401 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:48.334385 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/1.log"
Apr 17 21:25:48.334689 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:48.334668 2572 generic.go:358] "Generic (PLEG): container finished" podID="e38f40b4-53d8-4792-8ced-d1409571e740" containerID="1974d59ffacdb01a562c92d28f0b924d329d9fcf630f72a2af903e78f2ba1836" exitCode=2
Apr 17 21:25:48.334755 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:48.334737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" event={"ID":"e38f40b4-53d8-4792-8ced-d1409571e740","Type":"ContainerDied","Data":"1974d59ffacdb01a562c92d28f0b924d329d9fcf630f72a2af903e78f2ba1836"}
Apr 17 21:25:48.334793 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:48.334779 2572 scope.go:117] "RemoveContainer" containerID="c0d5724668fa62fedc06bb65f8a6ee7bda525e83e7a0837295e54f5e2e2625bc"
Apr 17 21:25:48.335157 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:48.335139 2572 scope.go:117] "RemoveContainer" containerID="1974d59ffacdb01a562c92d28f0b924d329d9fcf630f72a2af903e78f2ba1836"
Apr 17 21:25:48.335397 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:25:48.335379 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:25:49.339958 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:49.339932 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/2.log"
Apr 17 21:25:53.575740 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:53.575705 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:53.576166 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:53.575799 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:25:53.576166 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:53.576110 2572 scope.go:117] "RemoveContainer" containerID="1974d59ffacdb01a562c92d28f0b924d329d9fcf630f72a2af903e78f2ba1836"
Apr 17 21:25:53.576337 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:25:53.576321 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:25:54.358397 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:54.358370 2572 scope.go:117] "RemoveContainer" containerID="1974d59ffacdb01a562c92d28f0b924d329d9fcf630f72a2af903e78f2ba1836"
Apr 17 21:25:54.358566 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:25:54.358548 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:25:54.481705 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:54.481672 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:25:54.481895 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:54.481715 2572 kubelet.go:2658] "SyncLoop (probe)"
probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" Apr 17 21:25:54.482163 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:25:54.482146 2572 scope.go:117] "RemoveContainer" containerID="47009ece0d91fc0ffbd0d733570ff2e919ecdc3e1a09aa8d872797b424c9f9ba" Apr 17 21:25:54.482412 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:25:54.482391 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:26:08.870051 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:08.864379 2572 scope.go:117] "RemoveContainer" containerID="47009ece0d91fc0ffbd0d733570ff2e919ecdc3e1a09aa8d872797b424c9f9ba" Apr 17 21:26:08.870051 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:08.864997 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:26:09.414503 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:09.414424 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/3.log" Apr 17 21:26:09.414804 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:09.414787 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/2.log" Apr 17 21:26:09.415127 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:09.415103 2572 generic.go:358] "Generic (PLEG): container finished" podID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" containerID="9f986feea590648e18701bee897ff8b15a4dda62641da0a4f2711e9a3f3c0847" exitCode=2 Apr 17 21:26:09.415286 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:09.415186 2572 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" event={"ID":"c4329648-06d9-49f2-bbc6-c9cb28c1e100","Type":"ContainerDied","Data":"9f986feea590648e18701bee897ff8b15a4dda62641da0a4f2711e9a3f3c0847"} Apr 17 21:26:09.415286 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:09.415235 2572 scope.go:117] "RemoveContainer" containerID="47009ece0d91fc0ffbd0d733570ff2e919ecdc3e1a09aa8d872797b424c9f9ba" Apr 17 21:26:09.415742 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:09.415725 2572 scope.go:117] "RemoveContainer" containerID="9f986feea590648e18701bee897ff8b15a4dda62641da0a4f2711e9a3f3c0847" Apr 17 21:26:09.415956 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:09.415938 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:26:09.864295 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:09.864262 2572 scope.go:117] "RemoveContainer" containerID="1974d59ffacdb01a562c92d28f0b924d329d9fcf630f72a2af903e78f2ba1836" Apr 17 21:26:10.420011 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:10.419934 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/3.log" Apr 17 21:26:10.420396 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:10.420330 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/2.log" Apr 17 21:26:10.420627 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:10.420605 2572 generic.go:358] "Generic (PLEG): container finished" podID="e38f40b4-53d8-4792-8ced-d1409571e740" 
containerID="1eeffea484b0d7eaf1db1e6e10b722bc62de941e3b9f2d531f0b5c0bdcc7941c" exitCode=2 Apr 17 21:26:10.420704 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:10.420674 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" event={"ID":"e38f40b4-53d8-4792-8ced-d1409571e740","Type":"ContainerDied","Data":"1eeffea484b0d7eaf1db1e6e10b722bc62de941e3b9f2d531f0b5c0bdcc7941c"} Apr 17 21:26:10.420755 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:10.420710 2572 scope.go:117] "RemoveContainer" containerID="1974d59ffacdb01a562c92d28f0b924d329d9fcf630f72a2af903e78f2ba1836" Apr 17 21:26:10.421157 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:10.421134 2572 scope.go:117] "RemoveContainer" containerID="1eeffea484b0d7eaf1db1e6e10b722bc62de941e3b9f2d531f0b5c0bdcc7941c" Apr 17 21:26:10.421420 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:10.421401 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740" Apr 17 21:26:10.422320 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:10.422305 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/3.log" Apr 17 21:26:11.427431 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:11.427403 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/3.log" Apr 17 21:26:13.575065 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:13.575033 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" Apr 17 21:26:13.575065 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:13.575070 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" Apr 17 21:26:13.575483 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:13.575466 2572 scope.go:117] "RemoveContainer" containerID="1eeffea484b0d7eaf1db1e6e10b722bc62de941e3b9f2d531f0b5c0bdcc7941c" Apr 17 21:26:13.575719 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:13.575700 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740" Apr 17 21:26:14.482318 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:14.482287 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" Apr 17 21:26:14.482318 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:14.482322 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" Apr 17 21:26:14.482738 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:14.482714 2572 scope.go:117] "RemoveContainer" containerID="9f986feea590648e18701bee897ff8b15a4dda62641da0a4f2711e9a3f3c0847" Apr 17 21:26:14.482910 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:14.482892 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" 
podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:26:25.864515 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:25.864480 2572 scope.go:117] "RemoveContainer" containerID="9f986feea590648e18701bee897ff8b15a4dda62641da0a4f2711e9a3f3c0847" Apr 17 21:26:25.864969 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:25.864655 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:26:26.865578 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:26.865547 2572 scope.go:117] "RemoveContainer" containerID="1eeffea484b0d7eaf1db1e6e10b722bc62de941e3b9f2d531f0b5c0bdcc7941c" Apr 17 21:26:26.866020 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:26.865725 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740" Apr 17 21:26:37.864348 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:37.864267 2572 scope.go:117] "RemoveContainer" containerID="9f986feea590648e18701bee897ff8b15a4dda62641da0a4f2711e9a3f3c0847" Apr 17 21:26:37.864714 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:37.864483 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" 
podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:26:40.868604 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:40.868566 2572 scope.go:117] "RemoveContainer" containerID="1eeffea484b0d7eaf1db1e6e10b722bc62de941e3b9f2d531f0b5c0bdcc7941c" Apr 17 21:26:40.869164 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:40.868818 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740" Apr 17 21:26:42.482507 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.482461 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4"] Apr 17 21:26:42.483519 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.483120 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7246fee8-7040-4e73-a863-f27ac5776673" containerName="maas-api" Apr 17 21:26:42.483519 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.483142 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7246fee8-7040-4e73-a863-f27ac5776673" containerName="maas-api" Apr 17 21:26:42.483519 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.483256 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7246fee8-7040-4e73-a863-f27ac5776673" containerName="maas-api" Apr 17 21:26:42.486850 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.486825 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.489341 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.489319 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 17 21:26:42.499196 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.499139 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4"] Apr 17 21:26:42.587472 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.587422 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.587472 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.587466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.587739 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.587496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cwtb\" (UniqueName: \"kubernetes.io/projected/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-kube-api-access-7cwtb\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.587739 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.587562 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.587739 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.587609 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.587739 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.587646 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.688601 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.688560 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 
21:26:42.688822 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.688625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.688822 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.688746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.688822 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.688774 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.688822 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.688797 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cwtb\" (UniqueName: \"kubernetes.io/projected/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-kube-api-access-7cwtb\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.689030 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.688855 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.689199 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.689131 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.689320 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.689203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.689320 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.689291 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.690918 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.690889 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.691222 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.691202 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.695892 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.695863 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cwtb\" (UniqueName: \"kubernetes.io/projected/3ba10146-c865-43ee-b3ba-c8ff54cc1e59-kube-api-access-7cwtb\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4\" (UID: \"3ba10146-c865-43ee-b3ba-c8ff54cc1e59\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.801834 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.801739 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:26:42.931824 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:42.931718 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4"] Apr 17 21:26:42.934573 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:26:42.934546 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ba10146_c865_43ee_b3ba_c8ff54cc1e59.slice/crio-21f7ad7cac43b95f0b69badbbaac81b55ef7b2a252720f6a8e0ada28e10698ed WatchSource:0}: Error finding container 21f7ad7cac43b95f0b69badbbaac81b55ef7b2a252720f6a8e0ada28e10698ed: Status 404 returned error can't find the container with id 21f7ad7cac43b95f0b69badbbaac81b55ef7b2a252720f6a8e0ada28e10698ed Apr 17 21:26:43.541827 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:43.541792 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" event={"ID":"3ba10146-c865-43ee-b3ba-c8ff54cc1e59","Type":"ContainerStarted","Data":"acf3ae86f191fdd87b01e2ee0c9db220cf70dea8d77071267255b01f4248e9f1"} Apr 17 21:26:43.541827 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:43.541833 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" event={"ID":"3ba10146-c865-43ee-b3ba-c8ff54cc1e59","Type":"ContainerStarted","Data":"21f7ad7cac43b95f0b69badbbaac81b55ef7b2a252720f6a8e0ada28e10698ed"} Apr 17 21:26:48.560875 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:48.560838 2572 generic.go:358] "Generic (PLEG): container finished" podID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" containerID="acf3ae86f191fdd87b01e2ee0c9db220cf70dea8d77071267255b01f4248e9f1" exitCode=0 Apr 17 21:26:48.561504 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:48.560912 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" event={"ID":"3ba10146-c865-43ee-b3ba-c8ff54cc1e59","Type":"ContainerDied","Data":"acf3ae86f191fdd87b01e2ee0c9db220cf70dea8d77071267255b01f4248e9f1"} Apr 17 21:26:49.566633 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:49.566606 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/0.log" Apr 17 21:26:49.567066 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:49.566921 2572 generic.go:358] "Generic (PLEG): container finished" podID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" containerID="ca31882c42816dc063040ca44afaffe49f1756ff478dde051bc0a366cf071ba4" exitCode=2 Apr 17 21:26:49.567066 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:49.567003 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" event={"ID":"3ba10146-c865-43ee-b3ba-c8ff54cc1e59","Type":"ContainerDied","Data":"ca31882c42816dc063040ca44afaffe49f1756ff478dde051bc0a366cf071ba4"} Apr 17 21:26:49.567441 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:49.567423 2572 scope.go:117] "RemoveContainer" containerID="ca31882c42816dc063040ca44afaffe49f1756ff478dde051bc0a366cf071ba4" Apr 17 21:26:50.572042 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:50.572014 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/1.log" Apr 17 21:26:50.572485 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:50.572435 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/0.log" Apr 17 21:26:50.572746 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:50.572723 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" containerID="aed5460b039deed727a871fb22e3d49234b0592ac2eccb3b66ea20af3baee315" exitCode=2 Apr 17 21:26:50.572811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:50.572792 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" event={"ID":"3ba10146-c865-43ee-b3ba-c8ff54cc1e59","Type":"ContainerDied","Data":"aed5460b039deed727a871fb22e3d49234b0592ac2eccb3b66ea20af3baee315"} Apr 17 21:26:50.572848 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:50.572837 2572 scope.go:117] "RemoveContainer" containerID="ca31882c42816dc063040ca44afaffe49f1756ff478dde051bc0a366cf071ba4" Apr 17 21:26:50.573289 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:50.573267 2572 scope.go:117] "RemoveContainer" containerID="aed5460b039deed727a871fb22e3d49234b0592ac2eccb3b66ea20af3baee315" Apr 17 21:26:50.573519 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:50.573491 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" Apr 17 21:26:50.864013 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:50.863928 2572 scope.go:117] "RemoveContainer" containerID="9f986feea590648e18701bee897ff8b15a4dda62641da0a4f2711e9a3f3c0847" Apr 17 21:26:51.577902 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:51.577877 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/1.log" Apr 17 21:26:51.579436 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:51.579417 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/4.log" Apr 17 21:26:51.579742 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:51.579728 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/3.log" Apr 17 21:26:51.580012 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:51.579991 2572 generic.go:358] "Generic (PLEG): container finished" podID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" containerID="713d01c542e503b1a4031fc197fadaa062448d4d2cacb3bac2d650a25ee8d088" exitCode=2 Apr 17 21:26:51.580083 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:51.580025 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" event={"ID":"c4329648-06d9-49f2-bbc6-c9cb28c1e100","Type":"ContainerDied","Data":"713d01c542e503b1a4031fc197fadaa062448d4d2cacb3bac2d650a25ee8d088"} Apr 17 21:26:51.580083 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:51.580051 2572 scope.go:117] "RemoveContainer" containerID="9f986feea590648e18701bee897ff8b15a4dda62641da0a4f2711e9a3f3c0847" Apr 17 21:26:51.580495 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:51.580478 2572 scope.go:117] "RemoveContainer" containerID="713d01c542e503b1a4031fc197fadaa062448d4d2cacb3bac2d650a25ee8d088" Apr 17 21:26:51.580720 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:51.580693 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:26:52.585016 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:52.584987 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/4.log"
Apr 17 21:26:52.802821 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:52.802778 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4"
Apr 17 21:26:52.802821 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:52.802821 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4"
Apr 17 21:26:52.803226 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:52.803201 2572 scope.go:117] "RemoveContainer" containerID="aed5460b039deed727a871fb22e3d49234b0592ac2eccb3b66ea20af3baee315"
Apr 17 21:26:52.803401 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:52.803384 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59"
Apr 17 21:26:52.864281 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:52.864213 2572 scope.go:117] "RemoveContainer" containerID="1eeffea484b0d7eaf1db1e6e10b722bc62de941e3b9f2d531f0b5c0bdcc7941c"
Apr 17 21:26:53.589858 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:53.589831 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/4.log"
Apr 17 21:26:53.590331 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:53.590272 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/3.log"
Apr 17 21:26:53.590628
ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:53.590606 2572 generic.go:358] "Generic (PLEG): container finished" podID="e38f40b4-53d8-4792-8ced-d1409571e740" containerID="733fc6b890f37a012f1b5775fe0261edbfac9c26055d1056afc1cb8fffc4a5e1" exitCode=2
Apr 17 21:26:53.590692 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:53.590674 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" event={"ID":"e38f40b4-53d8-4792-8ced-d1409571e740","Type":"ContainerDied","Data":"733fc6b890f37a012f1b5775fe0261edbfac9c26055d1056afc1cb8fffc4a5e1"}
Apr 17 21:26:53.590734 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:53.590713 2572 scope.go:117] "RemoveContainer" containerID="1eeffea484b0d7eaf1db1e6e10b722bc62de941e3b9f2d531f0b5c0bdcc7941c"
Apr 17 21:26:53.591107 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:53.591088 2572 scope.go:117] "RemoveContainer" containerID="733fc6b890f37a012f1b5775fe0261edbfac9c26055d1056afc1cb8fffc4a5e1"
Apr 17 21:26:53.591321 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:53.591304 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:26:54.481846 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:54.481818 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:26:54.482040 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:54.481860 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:26:54.482367 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:54.482347 2572 scope.go:117]
"RemoveContainer" containerID="713d01c542e503b1a4031fc197fadaa062448d4d2cacb3bac2d650a25ee8d088"
Apr 17 21:26:54.482584 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:26:54.482564 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:26:54.594919 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:26:54.594898 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/4.log"
Apr 17 21:27:03.575207 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:03.575154 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:27:03.575207 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:03.575210 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:27:03.575728 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:03.575596 2572 scope.go:117] "RemoveContainer" containerID="733fc6b890f37a012f1b5775fe0261edbfac9c26055d1056afc1cb8fffc4a5e1"
Apr 17 21:27:03.575801 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:03.575782 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:27:03.863596 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:03.863524 2572
scope.go:117] "RemoveContainer" containerID="aed5460b039deed727a871fb22e3d49234b0592ac2eccb3b66ea20af3baee315"
Apr 17 21:27:04.628495 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:04.628468 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/2.log"
Apr 17 21:27:04.628884 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:04.628822 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/1.log"
Apr 17 21:27:04.629137 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:04.629119 2572 generic.go:358] "Generic (PLEG): container finished" podID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" containerID="41709967b92f8a870c6d1868b60c7d4a939cc7f9c2afe05135ca87541da5032d" exitCode=2
Apr 17 21:27:04.629217 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:04.629195 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" event={"ID":"3ba10146-c865-43ee-b3ba-c8ff54cc1e59","Type":"ContainerDied","Data":"41709967b92f8a870c6d1868b60c7d4a939cc7f9c2afe05135ca87541da5032d"}
Apr 17 21:27:04.629264 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:04.629239 2572 scope.go:117] "RemoveContainer" containerID="aed5460b039deed727a871fb22e3d49234b0592ac2eccb3b66ea20af3baee315"
Apr 17 21:27:04.629671 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:04.629651 2572 scope.go:117] "RemoveContainer" containerID="41709967b92f8a870c6d1868b60c7d4a939cc7f9c2afe05135ca87541da5032d"
Apr 17 21:27:04.629895 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:04.629867 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main
pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59"
Apr 17 21:27:05.634410 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:05.634379 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/2.log"
Apr 17 21:27:06.867925 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:06.865769 2572 scope.go:117] "RemoveContainer" containerID="713d01c542e503b1a4031fc197fadaa062448d4d2cacb3bac2d650a25ee8d088"
Apr 17 21:27:06.867925 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:06.866012 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:27:12.802287 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:12.802244 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4"
Apr 17 21:27:12.802287 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:12.802284 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4"
Apr 17 21:27:12.802803 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:12.802712 2572 scope.go:117] "RemoveContainer" containerID="41709967b92f8a870c6d1868b60c7d4a939cc7f9c2afe05135ca87541da5032d"
Apr 17 21:27:12.802926 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:12.802907 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for
\"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59"
Apr 17 21:27:15.863784 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:15.863747 2572 scope.go:117] "RemoveContainer" containerID="733fc6b890f37a012f1b5775fe0261edbfac9c26055d1056afc1cb8fffc4a5e1"
Apr 17 21:27:15.864153 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:15.863934 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:27:20.864071 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:20.864038 2572 scope.go:117] "RemoveContainer" containerID="713d01c542e503b1a4031fc197fadaa062448d4d2cacb3bac2d650a25ee8d088"
Apr 17 21:27:20.864482 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:20.864253 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:27:24.864166 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:24.864130 2572 scope.go:117] "RemoveContainer" containerID="41709967b92f8a870c6d1868b60c7d4a939cc7f9c2afe05135ca87541da5032d"
Apr 17 21:27:25.706357 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:25.706328 2572 log.go:25] "Finished parsing log file"
path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/3.log"
Apr 17 21:27:25.706749 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:25.706733 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/2.log"
Apr 17 21:27:25.707047 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:25.707026 2572 generic.go:358] "Generic (PLEG): container finished" podID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" containerID="eb1f2f9dd99452a0689bf99c4be4ea00d109b7f915c22a625d76ce23b03b329a" exitCode=2
Apr 17 21:27:25.707129 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:25.707103 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" event={"ID":"3ba10146-c865-43ee-b3ba-c8ff54cc1e59","Type":"ContainerDied","Data":"eb1f2f9dd99452a0689bf99c4be4ea00d109b7f915c22a625d76ce23b03b329a"}
Apr 17 21:27:25.707188 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:25.707151 2572 scope.go:117] "RemoveContainer" containerID="41709967b92f8a870c6d1868b60c7d4a939cc7f9c2afe05135ca87541da5032d"
Apr 17 21:27:25.707571 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:25.707556 2572 scope.go:117] "RemoveContainer" containerID="eb1f2f9dd99452a0689bf99c4be4ea00d109b7f915c22a625d76ce23b03b329a"
Apr 17 21:27:25.707785 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:25.707767 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59"
Apr 17 21:27:26.712325 ip-10-0-135-174 kubenswrapper[2572]: I0417
21:27:26.712295 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/3.log"
Apr 17 21:27:27.864578 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:27.864543 2572 scope.go:117] "RemoveContainer" containerID="733fc6b890f37a012f1b5775fe0261edbfac9c26055d1056afc1cb8fffc4a5e1"
Apr 17 21:27:27.864978 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:27.864752 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:27:32.802240 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:32.802206 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4"
Apr 17 21:27:32.802240 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:32.802245 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4"
Apr 17 21:27:32.802776 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:32.802707 2572 scope.go:117] "RemoveContainer" containerID="eb1f2f9dd99452a0689bf99c4be4ea00d109b7f915c22a625d76ce23b03b329a"
Apr 17 21:27:32.802942 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:32.802919 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4"
podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59"
Apr 17 21:27:35.864003 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:35.863966 2572 scope.go:117] "RemoveContainer" containerID="713d01c542e503b1a4031fc197fadaa062448d4d2cacb3bac2d650a25ee8d088"
Apr 17 21:27:35.864523 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:35.864156 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:27:40.863853 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:40.863818 2572 scope.go:117] "RemoveContainer" containerID="733fc6b890f37a012f1b5775fe0261edbfac9c26055d1056afc1cb8fffc4a5e1"
Apr 17 21:27:40.864372 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:40.864010 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:27:44.864056 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:44.864024 2572 scope.go:117] "RemoveContainer" containerID="eb1f2f9dd99452a0689bf99c4be4ea00d109b7f915c22a625d76ce23b03b329a"
Apr 17 21:27:44.864525 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:44.864230 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\""
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59"
Apr 17 21:27:50.864563 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:50.864531 2572 scope.go:117] "RemoveContainer" containerID="713d01c542e503b1a4031fc197fadaa062448d4d2cacb3bac2d650a25ee8d088"
Apr 17 21:27:50.865035 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:50.864744 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:27:53.863880 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:53.863842 2572 scope.go:117] "RemoveContainer" containerID="733fc6b890f37a012f1b5775fe0261edbfac9c26055d1056afc1cb8fffc4a5e1"
Apr 17 21:27:53.864392 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:53.864104 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:27:57.863733 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:27:57.863699 2572 scope.go:117] "RemoveContainer" containerID="eb1f2f9dd99452a0689bf99c4be4ea00d109b7f915c22a625d76ce23b03b329a"
Apr 17 21:27:57.864325 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:27:57.863892 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main
pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59"
Apr 17 21:28:01.864035 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:01.863996 2572 scope.go:117] "RemoveContainer" containerID="713d01c542e503b1a4031fc197fadaa062448d4d2cacb3bac2d650a25ee8d088"
Apr 17 21:28:01.864447 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:01.864227 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:28:04.863833 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:04.863792 2572 scope.go:117] "RemoveContainer" containerID="733fc6b890f37a012f1b5775fe0261edbfac9c26055d1056afc1cb8fffc4a5e1"
Apr 17 21:28:04.864325 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:04.864014 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:28:09.863691 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:09.863615 2572 scope.go:117] "RemoveContainer" containerID="eb1f2f9dd99452a0689bf99c4be4ea00d109b7f915c22a625d76ce23b03b329a"
Apr 17 21:28:10.866409 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:10.866380 2572 log.go:25] "Finished parsing log file"
path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/4.log"
Apr 17 21:28:10.866940 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:10.866797 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/3.log"
Apr 17 21:28:10.867113 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:10.867091 2572 generic.go:358] "Generic (PLEG): container finished" podID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" containerID="12f92241982ad335031f04fe3dbb72eeb572ab18d709c482f8300a20e7489bc3" exitCode=2
Apr 17 21:28:10.868019 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:10.867996 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" event={"ID":"3ba10146-c865-43ee-b3ba-c8ff54cc1e59","Type":"ContainerDied","Data":"12f92241982ad335031f04fe3dbb72eeb572ab18d709c482f8300a20e7489bc3"}
Apr 17 21:28:10.868073 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:10.868040 2572 scope.go:117] "RemoveContainer" containerID="eb1f2f9dd99452a0689bf99c4be4ea00d109b7f915c22a625d76ce23b03b329a"
Apr 17 21:28:10.868546 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:10.868523 2572 scope.go:117] "RemoveContainer" containerID="12f92241982ad335031f04fe3dbb72eeb572ab18d709c482f8300a20e7489bc3"
Apr 17 21:28:10.868787 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:10.868767 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59"
Apr 17 21:28:11.872292 ip-10-0-135-174 kubenswrapper[2572]: I0417
21:28:11.872260 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/4.log"
Apr 17 21:28:12.802415 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:12.802373 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4"
Apr 17 21:28:12.802415 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:12.802414 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4"
Apr 17 21:28:12.802881 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:12.802861 2572 scope.go:117] "RemoveContainer" containerID="12f92241982ad335031f04fe3dbb72eeb572ab18d709c482f8300a20e7489bc3"
Apr 17 21:28:12.803086 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:12.803067 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59"
Apr 17 21:28:14.863671 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:14.863641 2572 scope.go:117] "RemoveContainer" containerID="713d01c542e503b1a4031fc197fadaa062448d4d2cacb3bac2d650a25ee8d088"
Apr 17 21:28:15.887310 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:15.887279 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/5.log"
Apr 17 21:28:15.887747 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:15.887678 2572 log.go:25] "Finished parsing log file"
path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/4.log"
Apr 17 21:28:15.888000 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:15.887973 2572 generic.go:358] "Generic (PLEG): container finished" podID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40" exitCode=2
Apr 17 21:28:15.888068 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:15.888048 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" event={"ID":"c4329648-06d9-49f2-bbc6-c9cb28c1e100","Type":"ContainerDied","Data":"f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40"}
Apr 17 21:28:15.888106 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:15.888093 2572 scope.go:117] "RemoveContainer" containerID="713d01c542e503b1a4031fc197fadaa062448d4d2cacb3bac2d650a25ee8d088"
Apr 17 21:28:15.888594 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:15.888574 2572 scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40"
Apr 17 21:28:15.888828 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:15.888809 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:28:16.789554 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:16.789520 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/4.log"
Apr 17 21:28:16.790154 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:16.790135 2572 log.go:25] "Finished parsing log file"
path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/4.log"
Apr 17 21:28:16.790812 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:16.790796 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/5.log"
Apr 17 21:28:16.791707 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:16.791687 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/4.log"
Apr 17 21:28:16.792311 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:16.792291 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/4.log"
Apr 17 21:28:16.792979 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:16.792959 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/5.log"
Apr 17 21:28:16.805082 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:16.805063 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/0.log"
Apr 17 21:28:16.807079 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:16.807059 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/0.log"
Apr 17 21:28:16.892239 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:16.892212 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/5.log"
Apr 17 21:28:17.863809 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:17.863778 2572 scope.go:117] "RemoveContainer"
containerID="733fc6b890f37a012f1b5775fe0261edbfac9c26055d1056afc1cb8fffc4a5e1"
Apr 17 21:28:18.900924 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:18.900896 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/5.log"
Apr 17 21:28:18.901469 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:18.901342 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/4.log"
Apr 17 21:28:18.901653 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:18.901633 2572 generic.go:358] "Generic (PLEG): container finished" podID="e38f40b4-53d8-4792-8ced-d1409571e740" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e" exitCode=2
Apr 17 21:28:18.901705 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:18.901685 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" event={"ID":"e38f40b4-53d8-4792-8ced-d1409571e740","Type":"ContainerDied","Data":"e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e"}
Apr 17 21:28:18.901742 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:18.901727 2572 scope.go:117] "RemoveContainer" containerID="733fc6b890f37a012f1b5775fe0261edbfac9c26055d1056afc1cb8fffc4a5e1"
Apr 17 21:28:18.902153 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:18.902139 2572 scope.go:117] "RemoveContainer" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e"
Apr 17 21:28:18.902360 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:18.902345 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\""
pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:28:19.907254 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:19.907223 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/5.log"
Apr 17 21:28:23.575436 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:23.575396 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:28:23.575436 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:23.575434 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6"
Apr 17 21:28:23.575855 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:23.575843 2572 scope.go:117] "RemoveContainer" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e"
Apr 17 21:28:23.576040 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:23.576023 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:28:24.481736 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:24.481695 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:28:24.481736 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:24.481737 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7"
Apr 17 21:28:24.482236 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:24.482220 2572
scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40" Apr 17 21:28:24.482418 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:24.482402 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:28:25.864439 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:25.864404 2572 scope.go:117] "RemoveContainer" containerID="12f92241982ad335031f04fe3dbb72eeb572ab18d709c482f8300a20e7489bc3" Apr 17 21:28:25.864850 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:25.864605 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" Apr 17 21:28:38.863612 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:38.863570 2572 scope.go:117] "RemoveContainer" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e" Apr 17 21:28:38.864102 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:38.863847 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740" Apr 17 21:28:38.864102 ip-10-0-135-174 kubenswrapper[2572]: I0417 
21:28:38.863864 2572 scope.go:117] "RemoveContainer" containerID="12f92241982ad335031f04fe3dbb72eeb572ab18d709c482f8300a20e7489bc3" Apr 17 21:28:38.864102 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:38.863936 2572 scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40" Apr 17 21:28:38.864102 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:38.864062 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" Apr 17 21:28:38.864102 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:38.864092 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:28:51.863772 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:51.863735 2572 scope.go:117] "RemoveContainer" containerID="12f92241982ad335031f04fe3dbb72eeb572ab18d709c482f8300a20e7489bc3" Apr 17 21:28:51.864294 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:51.863925 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" Apr 17 21:28:52.864402 
ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:52.864372 2572 scope.go:117] "RemoveContainer" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e" Apr 17 21:28:52.864837 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:52.864563 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740" Apr 17 21:28:53.864464 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:28:53.864423 2572 scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40" Apr 17 21:28:53.866821 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:28:53.864669 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:29:02.863998 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:02.863963 2572 scope.go:117] "RemoveContainer" containerID="12f92241982ad335031f04fe3dbb72eeb572ab18d709c482f8300a20e7489bc3" Apr 17 21:29:02.864531 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:02.864156 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" Apr 17 
21:29:04.863900 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:04.863861 2572 scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40" Apr 17 21:29:04.864368 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:04.864144 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:29:05.864247 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:05.864209 2572 scope.go:117] "RemoveContainer" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e" Apr 17 21:29:05.864707 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:05.864416 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740" Apr 17 21:29:13.864085 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:13.864053 2572 scope.go:117] "RemoveContainer" containerID="12f92241982ad335031f04fe3dbb72eeb572ab18d709c482f8300a20e7489bc3" Apr 17 21:29:13.864476 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:13.864235 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" 
podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" Apr 17 21:29:17.864310 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:17.864276 2572 scope.go:117] "RemoveContainer" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e" Apr 17 21:29:17.864692 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:17.864464 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740" Apr 17 21:29:19.863728 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:19.863699 2572 scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40" Apr 17 21:29:19.864084 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:19.863860 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:29:27.863510 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:27.863475 2572 scope.go:117] "RemoveContainer" containerID="12f92241982ad335031f04fe3dbb72eeb572ab18d709c482f8300a20e7489bc3" Apr 17 21:29:27.864007 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:27.863667 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" Apr 17 21:29:30.864561 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:30.864531 2572 scope.go:117] "RemoveContainer" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e" Apr 17 21:29:30.864921 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:30.864726 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740" Apr 17 21:29:32.863987 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:32.863958 2572 scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40" Apr 17 21:29:32.864381 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:32.864141 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:29:41.863939 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:41.863854 2572 scope.go:117] "RemoveContainer" containerID="12f92241982ad335031f04fe3dbb72eeb572ab18d709c482f8300a20e7489bc3" Apr 17 21:29:42.189401 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:42.189341 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/5.log" Apr 17 21:29:42.189691 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:42.189677 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/4.log" Apr 17 21:29:42.190007 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:42.189986 2572 generic.go:358] "Generic (PLEG): container finished" podID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" containerID="a56cc459d375f389916d9924c3839fdd0977366fae3b432325d655e607e50526" exitCode=2 Apr 17 21:29:42.190099 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:42.190075 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" event={"ID":"3ba10146-c865-43ee-b3ba-c8ff54cc1e59","Type":"ContainerDied","Data":"a56cc459d375f389916d9924c3839fdd0977366fae3b432325d655e607e50526"} Apr 17 21:29:42.190142 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:42.190124 2572 scope.go:117] "RemoveContainer" containerID="12f92241982ad335031f04fe3dbb72eeb572ab18d709c482f8300a20e7489bc3" Apr 17 21:29:42.190499 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:42.190484 2572 scope.go:117] "RemoveContainer" containerID="a56cc459d375f389916d9924c3839fdd0977366fae3b432325d655e607e50526" Apr 17 21:29:42.190701 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:42.190679 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" Apr 17 21:29:42.801962 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:42.801926 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:29:42.801962 ip-10-0-135-174 kubenswrapper[2572]: I0417 
21:29:42.801964 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" Apr 17 21:29:43.194961 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:43.194889 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/5.log" Apr 17 21:29:43.195583 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:43.195568 2572 scope.go:117] "RemoveContainer" containerID="a56cc459d375f389916d9924c3839fdd0977366fae3b432325d655e607e50526" Apr 17 21:29:43.195756 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:43.195738 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" Apr 17 21:29:43.863645 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:43.863614 2572 scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40" Apr 17 21:29:43.863839 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:43.863785 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:29:45.863472 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:45.863438 2572 scope.go:117] "RemoveContainer" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e" Apr 17 21:29:45.863862 
ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:45.863618 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740" Apr 17 21:29:54.863535 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:54.863501 2572 scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40" Apr 17 21:29:54.863920 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:54.863678 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:29:55.864046 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:55.864017 2572 scope.go:117] "RemoveContainer" containerID="a56cc459d375f389916d9924c3839fdd0977366fae3b432325d655e607e50526" Apr 17 21:29:55.864455 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:55.864224 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" Apr 17 21:29:57.864252 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:29:57.864222 2572 scope.go:117] "RemoveContainer" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e" Apr 17 
21:29:57.864661 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:29:57.864398 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740" Apr 17 21:30:00.141942 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:00.141906 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607690-zpqb6"] Apr 17 21:30:00.146508 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:00.146490 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607690-zpqb6" Apr 17 21:30:00.148811 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:00.148790 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-c56s7\"" Apr 17 21:30:00.151671 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:00.151650 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607690-zpqb6"] Apr 17 21:30:00.243266 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:00.243228 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-988fr\" (UniqueName: \"kubernetes.io/projected/14394a15-95aa-4eb9-9764-9780201f3dbf-kube-api-access-988fr\") pod \"maas-api-key-cleanup-29607690-zpqb6\" (UID: \"14394a15-95aa-4eb9-9764-9780201f3dbf\") " pod="opendatahub/maas-api-key-cleanup-29607690-zpqb6" Apr 17 21:30:00.344041 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:00.344003 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-988fr\" (UniqueName: 
\"kubernetes.io/projected/14394a15-95aa-4eb9-9764-9780201f3dbf-kube-api-access-988fr\") pod \"maas-api-key-cleanup-29607690-zpqb6\" (UID: \"14394a15-95aa-4eb9-9764-9780201f3dbf\") " pod="opendatahub/maas-api-key-cleanup-29607690-zpqb6" Apr 17 21:30:00.352071 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:00.352047 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-988fr\" (UniqueName: \"kubernetes.io/projected/14394a15-95aa-4eb9-9764-9780201f3dbf-kube-api-access-988fr\") pod \"maas-api-key-cleanup-29607690-zpqb6\" (UID: \"14394a15-95aa-4eb9-9764-9780201f3dbf\") " pod="opendatahub/maas-api-key-cleanup-29607690-zpqb6" Apr 17 21:30:00.458016 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:00.457951 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607690-zpqb6" Apr 17 21:30:00.574703 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:00.574680 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607690-zpqb6"] Apr 17 21:30:00.577285 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:30:00.577259 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14394a15_95aa_4eb9_9764_9780201f3dbf.slice/crio-c8acf6d39e070ea965d03f42f6fca8cc382702f3c8eaf1cdf824164241408eba WatchSource:0}: Error finding container c8acf6d39e070ea965d03f42f6fca8cc382702f3c8eaf1cdf824164241408eba: Status 404 returned error can't find the container with id c8acf6d39e070ea965d03f42f6fca8cc382702f3c8eaf1cdf824164241408eba Apr 17 21:30:01.260886 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:01.260857 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607690-zpqb6" event={"ID":"14394a15-95aa-4eb9-9764-9780201f3dbf","Type":"ContainerStarted","Data":"c8acf6d39e070ea965d03f42f6fca8cc382702f3c8eaf1cdf824164241408eba"} Apr 17 21:30:02.265589 
ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:02.265556 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607690-zpqb6" event={"ID":"14394a15-95aa-4eb9-9764-9780201f3dbf","Type":"ContainerStarted","Data":"1a3911bd404971c95275929ca1f0c4f2ff5f231817ce1ddd41f6c5673537ebb1"} Apr 17 21:30:02.281239 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:02.281105 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607690-zpqb6" podStartSLOduration=1.68617596 podStartE2EDuration="2.281089131s" podCreationTimestamp="2026-04-17 21:30:00 +0000 UTC" firstStartedPulling="2026-04-17 21:30:00.578992427 +0000 UTC m=+1004.275547481" lastFinishedPulling="2026-04-17 21:30:01.173905603 +0000 UTC m=+1004.870460652" observedRunningTime="2026-04-17 21:30:02.280463727 +0000 UTC m=+1005.977018798" watchObservedRunningTime="2026-04-17 21:30:02.281089131 +0000 UTC m=+1005.977644203" Apr 17 21:30:07.864006 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:07.863974 2572 scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40" Apr 17 21:30:07.864419 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:30:07.864166 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100" Apr 17 21:30:09.966033 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:09.965995 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/5.log" Apr 17 21:30:10.094382 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:10.094352 2572 log.go:25] "Finished parsing log 
file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/5.log" Apr 17 21:30:10.215326 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:10.215288 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/storage-initializer/0.log" Apr 17 21:30:10.863524 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:10.863490 2572 scope.go:117] "RemoveContainer" containerID="a56cc459d375f389916d9924c3839fdd0977366fae3b432325d655e607e50526" Apr 17 21:30:10.863740 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:30:10.863673 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" Apr 17 21:30:11.229237 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:11.229206 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7f57fb5654-f4ngj_ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1/authorino/0.log" Apr 17 21:30:11.864036 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:11.863998 2572 scope.go:117] "RemoveContainer" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e" Apr 17 21:30:11.864262 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:30:11.864228 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740" Apr 
17 21:30:15.415061 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:15.415031 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29607690-zpqb6_14394a15-95aa-4eb9-9764-9780201f3dbf/cleanup/0.log" Apr 17 21:30:15.529682 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:15.529646 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-764cf56b49-dz8mz_32031ea5-5706-4f88-aeb9-c78a19f458da/manager/0.log" Apr 17 21:30:16.022928 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:16.022892 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-694fdf7c65-zjznf_df4c4082-f6ad-45b2-864e-fb191d6d3aa8/manager/0.log" Apr 17 21:30:16.133641 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:16.133612 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-hbzsk_c75eaa04-92b2-4fa9-8dad-f52397712f27/postgres/0.log" Apr 17 21:30:17.425151 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:17.425113 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7f57fb5654-f4ngj_ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1/authorino/0.log" Apr 17 21:30:18.819911 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:18.819878 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-56dddbd4f7-5tsjk_018458a5-1c58-49b8-b783-bebebc553f84/kube-auth-proxy/0.log" Apr 17 21:30:19.168828 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:19.168742 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7c95fdb69b-rgqgn_a7386c80-9465-42c2-b807-c545f0e73d34/router/0.log" Apr 17 21:30:19.681913 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:19.681885 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/storage-initializer/0.log" 
Apr 17 21:30:19.688603 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:19.688568 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_c4329648-06d9-49f2-bbc6-c9cb28c1e100/main/5.log" Apr 17 21:30:19.938245 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:19.938125 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/storage-initializer/0.log" Apr 17 21:30:19.943965 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:19.943942 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_3ba10146-c865-43ee-b3ba-c8ff54cc1e59/main/5.log" Apr 17 21:30:20.064048 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:20.064017 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/storage-initializer/0.log" Apr 17 21:30:20.070259 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:20.070235 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_e38f40b4-53d8-4792-8ced-d1409571e740/main/5.log" Apr 17 21:30:21.864211 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:21.864162 2572 scope.go:117] "RemoveContainer" containerID="a56cc459d375f389916d9924c3839fdd0977366fae3b432325d655e607e50526" Apr 17 21:30:21.864531 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:30:21.864355 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59" Apr 17 
21:30:21.864531 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:21.864363 2572 scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40"
Apr 17 21:30:21.864531 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:30:21.864474 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:30:22.334969 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:22.334930 2572 generic.go:358] "Generic (PLEG): container finished" podID="14394a15-95aa-4eb9-9764-9780201f3dbf" containerID="1a3911bd404971c95275929ca1f0c4f2ff5f231817ce1ddd41f6c5673537ebb1" exitCode=6
Apr 17 21:30:22.335131 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:22.335005 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607690-zpqb6" event={"ID":"14394a15-95aa-4eb9-9764-9780201f3dbf","Type":"ContainerDied","Data":"1a3911bd404971c95275929ca1f0c4f2ff5f231817ce1ddd41f6c5673537ebb1"}
Apr 17 21:30:22.335328 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:22.335314 2572 scope.go:117] "RemoveContainer" containerID="1a3911bd404971c95275929ca1f0c4f2ff5f231817ce1ddd41f6c5673537ebb1"
Apr 17 21:30:23.340245 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:23.340208 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607690-zpqb6" event={"ID":"14394a15-95aa-4eb9-9764-9780201f3dbf","Type":"ContainerStarted","Data":"c6d5811ba795bb8bb38d3573846f9eaac6efcd3948baf87b6c3abce383bc7b79"}
Apr 17 21:30:24.864430 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:24.864394 2572 scope.go:117] "RemoveContainer" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e"
Apr 17 21:30:24.864810 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:30:24.864564 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:30:26.707797 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:26.707767 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xjq7n_e1afcb7f-a57d-4014-a29d-5313bb6b154c/global-pull-secret-syncer/0.log"
Apr 17 21:30:26.754890 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:26.754864 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-hllsm_aa8fd541-9d45-4bc6-891c-792a7ef59496/konnectivity-agent/0.log"
Apr 17 21:30:26.855503 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:26.855479 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-174.ec2.internal_a9002f67337ee452fe59c1455ae92d3b/haproxy/0.log"
Apr 17 21:30:31.254954 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:31.254915 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7f57fb5654-f4ngj_ec083ab0-4ec0-4e8c-9da5-aa6d5f6591a1/authorino/0.log"
Apr 17 21:30:33.292634 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.292602 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-kcrhz_621a2863-afe1-4038-a82c-a6786ab54ffc/monitoring-plugin/0.log"
Apr 17 21:30:33.474151 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.474115 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-szdt5_d51a735f-f9c9-457e-9805-9c1c8b0e30cc/node-exporter/0.log"
Apr 17 21:30:33.494795 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.494769 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-szdt5_d51a735f-f9c9-457e-9805-9c1c8b0e30cc/kube-rbac-proxy/0.log"
Apr 17 21:30:33.517798 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.517776 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-szdt5_d51a735f-f9c9-457e-9805-9c1c8b0e30cc/init-textfile/0.log"
Apr 17 21:30:33.543131 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.543065 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8l9gf_977c9f1c-37d4-44d9-b178-b63a50b4a640/kube-rbac-proxy-main/0.log"
Apr 17 21:30:33.564627 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.564607 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8l9gf_977c9f1c-37d4-44d9-b178-b63a50b4a640/kube-rbac-proxy-self/0.log"
Apr 17 21:30:33.586103 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.586085 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8l9gf_977c9f1c-37d4-44d9-b178-b63a50b4a640/openshift-state-metrics/0.log"
Apr 17 21:30:33.626038 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.626018 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6291479f-aae0-45f7-b5db-f1870619ef39/prometheus/0.log"
Apr 17 21:30:33.643703 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.643684 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6291479f-aae0-45f7-b5db-f1870619ef39/config-reloader/0.log"
Apr 17 21:30:33.666544 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.666516 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6291479f-aae0-45f7-b5db-f1870619ef39/thanos-sidecar/0.log"
Apr 17 21:30:33.690001 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.689981 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6291479f-aae0-45f7-b5db-f1870619ef39/kube-rbac-proxy-web/0.log"
Apr 17 21:30:33.711680 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.711659 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6291479f-aae0-45f7-b5db-f1870619ef39/kube-rbac-proxy/0.log"
Apr 17 21:30:33.730807 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.730785 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6291479f-aae0-45f7-b5db-f1870619ef39/kube-rbac-proxy-thanos/0.log"
Apr 17 21:30:33.749165 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.749140 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6291479f-aae0-45f7-b5db-f1870619ef39/init-config-reloader/0.log"
Apr 17 21:30:33.774402 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.774383 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-7rgcx_2a733fe4-f6c7-40c4-9948-73411f5dd161/prometheus-operator/0.log"
Apr 17 21:30:33.792044 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.792027 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-7rgcx_2a733fe4-f6c7-40c4-9948-73411f5dd161/kube-rbac-proxy/0.log"
Apr 17 21:30:33.814931 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.814876 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-fz26b_e5ddf28a-4f8a-4a7c-af29-4c0befc12d5f/prometheus-operator-admission-webhook/0.log"
Apr 17 21:30:33.938805 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.938779 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5dfb58dddc-9ttxq_08cc730b-19b5-47e6-9b01-174bc4e3cc13/thanos-query/0.log"
Apr 17 21:30:33.961187 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.961149 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5dfb58dddc-9ttxq_08cc730b-19b5-47e6-9b01-174bc4e3cc13/kube-rbac-proxy-web/0.log"
Apr 17 21:30:33.981045 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.981027 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5dfb58dddc-9ttxq_08cc730b-19b5-47e6-9b01-174bc4e3cc13/kube-rbac-proxy/0.log"
Apr 17 21:30:33.998865 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:33.998849 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5dfb58dddc-9ttxq_08cc730b-19b5-47e6-9b01-174bc4e3cc13/prom-label-proxy/0.log"
Apr 17 21:30:34.019344 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:34.019326 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5dfb58dddc-9ttxq_08cc730b-19b5-47e6-9b01-174bc4e3cc13/kube-rbac-proxy-rules/0.log"
Apr 17 21:30:34.039026 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:34.039008 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5dfb58dddc-9ttxq_08cc730b-19b5-47e6-9b01-174bc4e3cc13/kube-rbac-proxy-metrics/0.log"
Apr 17 21:30:34.864360 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:34.864326 2572 scope.go:117] "RemoveContainer" containerID="a56cc459d375f389916d9924c3839fdd0977366fae3b432325d655e607e50526"
Apr 17 21:30:34.864735 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:30:34.864508 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59"
Apr 17 21:30:35.226899 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.226863 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"]
Apr 17 21:30:35.230659 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.230643 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.232833 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.232813 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-w64ck\"/\"kube-root-ca.crt\""
Apr 17 21:30:35.232964 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.232814 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-w64ck\"/\"openshift-service-ca.crt\""
Apr 17 21:30:35.232964 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.232849 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-w64ck\"/\"default-dockercfg-fwnkt\""
Apr 17 21:30:35.240253 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.240231 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"]
Apr 17 21:30:35.338709 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.338670 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/600e58c1-63ca-4a88-902a-c38cbe8229b2-sys\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.338709 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.338721 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmjbq\" (UniqueName: \"kubernetes.io/projected/600e58c1-63ca-4a88-902a-c38cbe8229b2-kube-api-access-xmjbq\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.338940 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.338797 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/600e58c1-63ca-4a88-902a-c38cbe8229b2-podres\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.338940 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.338830 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/600e58c1-63ca-4a88-902a-c38cbe8229b2-proc\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.338940 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.338884 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/600e58c1-63ca-4a88-902a-c38cbe8229b2-lib-modules\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.439328 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.439293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/600e58c1-63ca-4a88-902a-c38cbe8229b2-podres\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.439509 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.439356 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/600e58c1-63ca-4a88-902a-c38cbe8229b2-proc\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.439509 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.439404 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/600e58c1-63ca-4a88-902a-c38cbe8229b2-lib-modules\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.439509 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.439476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/600e58c1-63ca-4a88-902a-c38cbe8229b2-sys\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.439509 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.439500 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/600e58c1-63ca-4a88-902a-c38cbe8229b2-proc\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.439783 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.439504 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmjbq\" (UniqueName: \"kubernetes.io/projected/600e58c1-63ca-4a88-902a-c38cbe8229b2-kube-api-access-xmjbq\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.439783 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.439547 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/600e58c1-63ca-4a88-902a-c38cbe8229b2-lib-modules\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.439783 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.439553 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/600e58c1-63ca-4a88-902a-c38cbe8229b2-sys\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.439783 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.439476 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/600e58c1-63ca-4a88-902a-c38cbe8229b2-podres\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.446403 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.446381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmjbq\" (UniqueName: \"kubernetes.io/projected/600e58c1-63ca-4a88-902a-c38cbe8229b2-kube-api-access-xmjbq\") pod \"perf-node-gather-daemonset-7qhp6\" (UID: \"600e58c1-63ca-4a88-902a-c38cbe8229b2\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.541331 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.541275 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:35.664551 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.664505 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"]
Apr 17 21:30:35.669685 ip-10-0-135-174 kubenswrapper[2572]: W0417 21:30:35.669654 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod600e58c1_63ca_4a88_902a_c38cbe8229b2.slice/crio-007fd825fe514b3c5e19aeb029c9296ba7de824cd14249915dcc3b6f7cf57f59 WatchSource:0}: Error finding container 007fd825fe514b3c5e19aeb029c9296ba7de824cd14249915dcc3b6f7cf57f59: Status 404 returned error can't find the container with id 007fd825fe514b3c5e19aeb029c9296ba7de824cd14249915dcc3b6f7cf57f59
Apr 17 21:30:35.864055 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:35.863994 2572 scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40"
Apr 17 21:30:35.864190 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:30:35.864146 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:30:36.384756 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:36.384725 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6" event={"ID":"600e58c1-63ca-4a88-902a-c38cbe8229b2","Type":"ContainerStarted","Data":"e07b0594e9c3abfa504bb10284546854a84f64e9f03fd28d552029196feb1e79"}
Apr 17 21:30:36.384756 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:36.384758 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6" event={"ID":"600e58c1-63ca-4a88-902a-c38cbe8229b2","Type":"ContainerStarted","Data":"007fd825fe514b3c5e19aeb029c9296ba7de824cd14249915dcc3b6f7cf57f59"}
Apr 17 21:30:36.385150 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:36.384865 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:36.400902 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:36.400857 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6" podStartSLOduration=1.400841602 podStartE2EDuration="1.400841602s" podCreationTimestamp="2026-04-17 21:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:30:36.398801966 +0000 UTC m=+1040.095357039" watchObservedRunningTime="2026-04-17 21:30:36.400841602 +0000 UTC m=+1040.097396678"
Apr 17 21:30:37.597007 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:37.596977 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s9qzj_4c6d8acf-b971-44ec-b14c-d6af7a3da43c/dns/0.log"
Apr 17 21:30:37.636386 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:37.636360 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s9qzj_4c6d8acf-b971-44ec-b14c-d6af7a3da43c/kube-rbac-proxy/0.log"
Apr 17 21:30:37.782011 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:37.781978 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qnj2m_778ea300-0830-4ba5-8bc4-8bc4314e7652/dns-node-resolver/0.log"
Apr 17 21:30:38.313178 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:38.313130 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qsc7l_500b1ed9-0358-4553-b67c-814ae8a286af/node-ca/0.log"
Apr 17 21:30:39.207492 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:39.207462 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-56dddbd4f7-5tsjk_018458a5-1c58-49b8-b783-bebebc553f84/kube-auth-proxy/0.log"
Apr 17 21:30:39.276026 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:39.275995 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7c95fdb69b-rgqgn_a7386c80-9465-42c2-b807-c545f0e73d34/router/0.log"
Apr 17 21:30:39.770127 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:39.770080 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dfvbk_03a3935c-b9b0-4fda-8142-8cd63a6b5a88/serve-healthcheck-canary/0.log"
Apr 17 21:30:39.863890 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:39.863855 2572 scope.go:117] "RemoveContainer" containerID="e1ac12d4940803b34aebb7e68cc7c8164801c54f2ac955fd3ae3ca2a647e429e"
Apr 17 21:30:39.864116 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:30:39.864095 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6_llm(e38f40b4-53d8-4792-8ced-d1409571e740)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xzfs6" podUID="e38f40b4-53d8-4792-8ced-d1409571e740"
Apr 17 21:30:40.304717 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:40.304683 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-d2h87_6bd70117-3252-4f39-aaf3-d2d5efefcc3b/insights-operator/1.log"
Apr 17 21:30:40.305088 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:40.304784 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-d2h87_6bd70117-3252-4f39-aaf3-d2d5efefcc3b/insights-operator/0.log"
Apr 17 21:30:40.612712 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:40.612630 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rjkj7_88bea75e-5668-4862-a73b-e197ee6429fe/kube-rbac-proxy/0.log"
Apr 17 21:30:40.661519 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:40.661491 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rjkj7_88bea75e-5668-4862-a73b-e197ee6429fe/exporter/0.log"
Apr 17 21:30:40.713494 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:40.713460 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rjkj7_88bea75e-5668-4862-a73b-e197ee6429fe/extractor/0.log"
Apr 17 21:30:42.398004 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:42.397972 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-7qhp6"
Apr 17 21:30:42.705785 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:42.705684 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29607690-zpqb6_14394a15-95aa-4eb9-9764-9780201f3dbf/cleanup/0.log"
Apr 17 21:30:42.705785 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:42.705696 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29607690-zpqb6_14394a15-95aa-4eb9-9764-9780201f3dbf/cleanup/1.log"
Apr 17 21:30:42.727952 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:42.727909 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-764cf56b49-dz8mz_32031ea5-5706-4f88-aeb9-c78a19f458da/manager/0.log"
Apr 17 21:30:42.862618 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:42.862583 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-694fdf7c65-zjznf_df4c4082-f6ad-45b2-864e-fb191d6d3aa8/manager/0.log"
Apr 17 21:30:42.880553 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:42.880521 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-hbzsk_c75eaa04-92b2-4fa9-8dad-f52397712f27/postgres/0.log"
Apr 17 21:30:43.409486 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:43.409454 2572 generic.go:358] "Generic (PLEG): container finished" podID="14394a15-95aa-4eb9-9764-9780201f3dbf" containerID="c6d5811ba795bb8bb38d3573846f9eaac6efcd3948baf87b6c3abce383bc7b79" exitCode=6
Apr 17 21:30:43.409910 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:43.409514 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607690-zpqb6" event={"ID":"14394a15-95aa-4eb9-9764-9780201f3dbf","Type":"ContainerDied","Data":"c6d5811ba795bb8bb38d3573846f9eaac6efcd3948baf87b6c3abce383bc7b79"}
Apr 17 21:30:43.409910 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:43.409569 2572 scope.go:117] "RemoveContainer" containerID="1a3911bd404971c95275929ca1f0c4f2ff5f231817ce1ddd41f6c5673537ebb1"
Apr 17 21:30:43.409910 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:43.409855 2572 scope.go:117] "RemoveContainer" containerID="c6d5811ba795bb8bb38d3573846f9eaac6efcd3948baf87b6c3abce383bc7b79"
Apr 17 21:30:43.410066 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:30:43.410056 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607690-zpqb6_opendatahub(14394a15-95aa-4eb9-9764-9780201f3dbf)\"" pod="opendatahub/maas-api-key-cleanup-29607690-zpqb6" podUID="14394a15-95aa-4eb9-9764-9780201f3dbf"
Apr 17 21:30:44.216103 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:44.216071 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-54f8864c6c-j26q5_f500dce8-6c1c-4523-9bf1-49f02322f25f/manager/0.log"
Apr 17 21:30:46.866473 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:46.866442 2572 scope.go:117] "RemoveContainer" containerID="a56cc459d375f389916d9924c3839fdd0977366fae3b432325d655e607e50526"
Apr 17 21:30:46.866910 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:30:46.866760 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4_llm(3ba10146-c865-43ee-b3ba-c8ff54cc1e59)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c85vqz4" podUID="3ba10146-c865-43ee-b3ba-c8ff54cc1e59"
Apr 17 21:30:47.864299 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:47.864267 2572 scope.go:117] "RemoveContainer" containerID="f0f92e209b034556fd2fc4febafaa553771ecbd656cae8bdccc3d1a914bfca40"
Apr 17 21:30:47.864548 ip-10-0-135-174 kubenswrapper[2572]: E0417 21:30:47.864463 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-zm6l7_llm(c4329648-06d9-49f2-bbc6-c9cb28c1e100)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-zm6l7" podUID="c4329648-06d9-49f2-bbc6-c9cb28c1e100"
Apr 17 21:30:48.915387 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:48.915354 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-xtmbm_bfbb2b11-b5bf-4910-b508-d65f63da7218/kube-storage-version-migrator-operator/1.log"
Apr 17 21:30:48.917140 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:48.917104 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-xtmbm_bfbb2b11-b5bf-4910-b508-d65f63da7218/kube-storage-version-migrator-operator/0.log"
Apr 17 21:30:50.014933 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:50.014901 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w6rk2_4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26/kube-multus-additional-cni-plugins/0.log"
Apr 17 21:30:50.034078 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:50.034047 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w6rk2_4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26/egress-router-binary-copy/0.log"
Apr 17 21:30:50.052658 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:50.052631 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w6rk2_4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26/cni-plugins/0.log"
Apr 17 21:30:50.071735 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:50.071705 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w6rk2_4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26/bond-cni-plugin/0.log"
Apr 17 21:30:50.091992 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:50.091965 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w6rk2_4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26/routeoverride-cni/0.log"
Apr 17 21:30:50.110370 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:50.110337 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w6rk2_4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26/whereabouts-cni-bincopy/0.log"
Apr 17 21:30:50.129468 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:50.129441 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w6rk2_4ec35e02-e4dc-4f4b-8be7-ab8dceabbc26/whereabouts-cni/0.log"
Apr 17 21:30:50.335072 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:50.334971 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nbbnz_066bf68c-ad0c-4b29-a288-000664effe73/kube-multus/0.log"
Apr 17 21:30:50.431857 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:50.431816 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nprtf_4574ebc4-f71f-480d-9e07-5e822f12bb1a/network-metrics-daemon/0.log"
Apr 17 21:30:50.449654 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:50.449622 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nprtf_4574ebc4-f71f-480d-9e07-5e822f12bb1a/kube-rbac-proxy/0.log"
Apr 17 21:30:51.326241 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:51.326203 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-controller/0.log"
Apr 17 21:30:51.341506 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:51.341470 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/0.log"
Apr 17 21:30:51.350546 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:51.350520 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovn-acl-logging/1.log"
Apr 17 21:30:51.370689 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:51.370668 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/kube-rbac-proxy-node/0.log"
Apr 17 21:30:51.391702 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:51.391666 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 21:30:51.406549 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:51.406519 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/northd/0.log"
Apr 17 21:30:51.424134 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:51.424112 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/nbdb/0.log"
Apr 17 21:30:51.442181 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:51.442148 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/sbdb/0.log"
Apr 17 21:30:51.594191 ip-10-0-135-174 kubenswrapper[2572]: I0417 21:30:51.594096 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7q89l_6b49b1b9-2c21-4e2b-aed1-79c8cdc16b82/ovnkube-controller/0.log"