Apr 17 20:12:51.281613 ip-10-0-132-57 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 20:12:51.281623 ip-10-0-132-57 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 20:12:51.281630 ip-10-0-132-57 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 20:12:51.281844 ip-10-0-132-57 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 20:13:01.430643 ip-10-0-132-57 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 20:13:01.430658 ip-10-0-132-57 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 49bd0f933f844d988129674e22ae0caf --
Apr 17 20:15:32.795758 ip-10-0-132-57 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 20:15:33.267833 ip-10-0-132-57 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:15:33.267833 ip-10-0-132-57 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 20:15:33.267833 ip-10-0-132-57 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:15:33.267833 ip-10-0-132-57 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 20:15:33.267833 ip-10-0-132-57 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:15:33.271187 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.271096 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 20:15:33.278041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278016 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:15:33.278041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278035 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:15:33.278041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278039 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:15:33.278041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278042 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:15:33.278041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278046 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:15:33.278041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278049 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278052 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278056 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true.
It will be removed in a future release. Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278061 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278064 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278067 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278069 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278072 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278074 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278077 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278080 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278082 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278085 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278087 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278090 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278093 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278095 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278098 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278101 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:15:33.278277 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278103 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278106 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278108 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278111 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278113 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278116 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 
20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278118 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278121 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278125 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278128 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278130 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278133 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278135 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278138 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278140 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278143 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278146 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278149 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278151 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278154 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:15:33.278783 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278156 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278159 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278161 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278164 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278167 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278169 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278172 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278174 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278177 2580 feature_gate.go:328] 
unrecognized feature gate: Example2 Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278180 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278182 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278185 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278187 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278189 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278192 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278194 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278197 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278199 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278202 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278204 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:15:33.279264 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278207 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278209 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278212 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278214 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278217 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278219 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278223 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278228 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278231 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278234 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278237 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278240 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278242 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278245 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278248 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278251 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278253 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278255 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278259 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:15:33.279761 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278261 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278264 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278266 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278659 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278665 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278668 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278670 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278673 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278675 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278678 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 
20:15:33.278681 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278683 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278686 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278688 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278691 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278693 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278696 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278698 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278701 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278703 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:15:33.280220 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278706 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278709 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278712 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278715 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278718 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278721 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278723 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278727 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278730 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278732 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278735 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278737 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278739 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:15:33.280724 
ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278742 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278744 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278747 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278749 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278751 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278754 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278757 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:15:33.280724 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278760 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278762 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278765 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278767 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278770 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278772 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278775 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278777 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278779 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278782 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278785 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278787 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278793 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278797 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278800 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278803 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278807 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278809 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278812 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:15:33.281246 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278814 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278820 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278823 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278825 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278827 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278830 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278832 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278835 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278837 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278840 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278842 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278845 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278847 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278850 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278852 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278854 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: 
W0417 20:15:33.278857 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278859 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278861 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278864 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:15:33.281726 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278866 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278869 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278872 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278874 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278877 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278880 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278882 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278885 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278887 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.278890 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280144 2580 flags.go:64] FLAG: --address="0.0.0.0" Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280154 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280159 2580 flags.go:64] FLAG: --anonymous-auth="true" Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280164 2580 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280170 2580 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280173 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280178 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280182 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280185 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280189 2580 flags.go:64] FLAG: 
--boot-id-file="/proc/sys/kernel/random/boot_id" Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280192 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 17 20:15:33.282229 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280196 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280199 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280201 2580 flags.go:64] FLAG: --cgroup-root="" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280204 2580 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280207 2580 flags.go:64] FLAG: --client-ca-file="" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280210 2580 flags.go:64] FLAG: --cloud-config="" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280213 2580 flags.go:64] FLAG: --cloud-provider="external" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280216 2580 flags.go:64] FLAG: --cluster-dns="[]" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280221 2580 flags.go:64] FLAG: --cluster-domain="" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280224 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280227 2580 flags.go:64] FLAG: --config-dir="" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280230 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280234 2580 flags.go:64] FLAG: --container-log-max-files="5" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280238 2580 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280241 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280244 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280247 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280251 2580 flags.go:64] FLAG: --contention-profiling="false" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280254 2580 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280257 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280260 2580 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280263 2580 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280267 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280269 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280272 2580 
flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 20:15:33.282750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280275 2580 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280281 2580 flags.go:64] FLAG: --enable-server="true" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280285 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280288 2580 flags.go:64] FLAG: --event-burst="100" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280292 2580 flags.go:64] FLAG: --event-qps="50" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280302 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280306 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280309 2580 flags.go:64] FLAG: --eviction-hard="" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280313 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280316 2580 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280319 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280322 2580 flags.go:64] FLAG: --eviction-soft="" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280325 2580 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280327 2580 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280330 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280333 2580 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280336 2580 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280339 2580 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280342 2580 flags.go:64] FLAG: --feature-gates="" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280345 2580 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280348 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280351 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280354 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280357 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280360 2580 flags.go:64] FLAG: --help="false" Apr 17 20:15:33.283347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280363 2580 flags.go:64] FLAG: 
--hostname-override="ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280366 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280368 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280371 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280375 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280378 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280380 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280383 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280386 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280394 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280398 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280401 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280403 2580 flags.go:64] FLAG: --kube-reserved="" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280406 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280409 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280412 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280415 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280418 2580 flags.go:64] FLAG: --lock-file="" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280420 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280423 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280426 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280431 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280434 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280436 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 20:15:33.283956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280439 2580 flags.go:64] FLAG: --logging-format="text" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280442 2580 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280445 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280448 2580 flags.go:64] FLAG: --manifest-url="" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280450 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280467 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280472 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280479 2580 flags.go:64] FLAG: --max-pods="110" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280483 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280485 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280488 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280491 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280494 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280497 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280500 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280507 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280510 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280513 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280517 2580 flags.go:64] FLAG: --pod-cidr="" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280520 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280525 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280529 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280532 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280535 2580 flags.go:64] FLAG: --port="10250" Apr 17 20:15:33.284572 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280538 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280541 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-022b9bb13c1abbaff" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280544 2580 flags.go:64] FLAG: 
--qos-reserved="" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280547 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280550 2580 flags.go:64] FLAG: --register-node="true" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280553 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280555 2580 flags.go:64] FLAG: --register-with-taints="" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280559 2580 flags.go:64] FLAG: --registry-burst="10" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280562 2580 flags.go:64] FLAG: --registry-qps="5" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280564 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280567 2580 flags.go:64] FLAG: --reserved-memory="" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280571 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280574 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280576 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280579 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280583 2580 flags.go:64] FLAG: --runonce="false" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280585 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280589 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280592 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280595 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280597 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280600 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280603 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280606 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280609 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280612 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 20:15:33.285159 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280614 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280618 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280621 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 20:15:33.285803 
ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280624 2580 flags.go:64] FLAG: --system-cgroups="" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280626 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280632 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280634 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280637 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280642 2580 flags.go:64] FLAG: --tls-min-version="" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280645 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280648 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280650 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280653 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280657 2580 flags.go:64] FLAG: --v="2" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280661 2580 flags.go:64] FLAG: --version="false" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280665 2580 flags.go:64] FLAG: --vmodule="" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280669 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.280672 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280759 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280763 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280766 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280768 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280771 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:15:33.285803 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280773 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280776 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280779 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280781 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280784 2580 feature_gate.go:328] unrecognized feature gate: 
NutanixMultiSubnets Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280786 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280789 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280792 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280795 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280797 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280801 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280805 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280807 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280810 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280812 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280815 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280818 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280820 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280823 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280826 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:15:33.286349 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280828 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280830 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280833 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280835 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280837 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280840 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280842 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:15:33.286904 ip-10-0-132-57 
kubenswrapper[2580]: W0417 20:15:33.280846 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280850 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280853 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280856 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280858 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280861 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280864 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280866 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280869 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280871 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280873 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280876 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:15:33.286904 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280878 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280881 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280883 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280885 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280888 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280891 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280897 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280899 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280901 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280904 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:15:33.287377 
ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280907 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280909 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280912 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280914 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280917 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280919 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280922 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280924 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280926 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280929 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:15:33.287377 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280931 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280934 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280936 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280939 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280941 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280943 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280946 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280948 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280950 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280953 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280955 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280958 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280960 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 
20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280962 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280965 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280967 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280969 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280972 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280976 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280979 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:15:33.287876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280981 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:15:33.288375 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.280983 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:15:33.288375 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.281703 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:15:33.289587 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.289570 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 20:15:33.289619 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.289589 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 20:15:33.289652 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289644 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:15:33.289652 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289650 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289654 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289658 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289660 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289663 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289666 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289669 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:15:33.289706 ip-10-0-132-57 
kubenswrapper[2580]: W0417 20:15:33.289672 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289675 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289677 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289680 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289682 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289685 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289687 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289689 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289692 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289694 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289697 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289699 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289703 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
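
The repeated feature_gate.go:328 warnings above and below are the kubelet reporting gate names it does not itself implement; they appear to be cluster-level OpenShift gates consumed by other components, so the kubelet logs each one and moves on. The same list is re-parsed several times during startup, which is why the block recurs. One quick way to confirm the noise is just that fixed list is to tally the distinct names in a saved copy of the journal. The stand-alone Go sketch below is illustrative only: the kubelet.log path is a placeholder, and the pattern relies solely on the literal warning text visible in these lines.

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

func main() {
	// Matches the warning text emitted at feature_gate.go:328 in this log.
	re := regexp.MustCompile(`unrecognized feature gate: (\S+)`)

	f, err := os.Open("kubelet.log") // placeholder path to the captured journal excerpt
	if err != nil {
		panic(err)
	}
	defer f.Close()

	counts := map[string]int{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		// A single captured line may contain several concatenated entries,
		// so collect every match on the line, not just the first.
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[1]]++
		}
	}

	names := make([]string, 0, len(counts))
	for n := range counts {
		names = append(names, n)
	}
	sort.Strings(names)
	for _, n := range names {
		fmt.Printf("%4d  %s\n", counts[n], n)
	}
}
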
Apr 17 20:15:33.289706 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289708 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289711 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289714 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289717 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289720 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289723 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289725 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289728 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289731 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289733 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289737 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289740 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289743 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289746 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289748 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289751 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289753 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289755 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289758 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289760 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:15:33.290189 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289762 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289765 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289767 2580 feature_gate.go:328] 
unrecognized feature gate: VSphereMultiDisk Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289770 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289772 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289775 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289777 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289780 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289782 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289786 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289790 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289792 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289795 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289797 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289800 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289803 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289805 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289808 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289810 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:15:33.290712 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289812 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289815 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289817 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289819 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289822 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289825 2580 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289827 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289830 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289833 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289835 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289837 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289840 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289842 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289845 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289847 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289849 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289852 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289854 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289857 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:15:33.291177 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289859 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289862 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289864 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289867 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289869 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289871 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289874 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.289879 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289987 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289992 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289996 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.289999 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290002 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290005 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290008 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290011 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:15:33.291681 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290013 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290016 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290019 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290022 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290025 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290027 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290029 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290032 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290034 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290037 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290039 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290042 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290044 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:15:33.292073 
ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290047 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290049 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290051 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290054 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290056 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290059 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290061 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:15:33.292073 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290064 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290066 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290068 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290071 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290074 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290076 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290078 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290081 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290083 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290086 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290088 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290091 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290093 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290096 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290098 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290101 2580 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290103 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290105 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290108 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290111 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:15:33.292571 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290113 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290116 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290118 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290120 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290123 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290125 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290127 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290130 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290132 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290135 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290137 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290139 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290142 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290144 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290147 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290149 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290152 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290154 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:15:33.293041 ip-10-0-132-57 
kubenswrapper[2580]: W0417 20:15:33.290156 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290159 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:15:33.293041 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290161 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290163 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290166 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290168 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290170 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290173 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290176 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290179 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290182 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290185 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290187 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290189 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290194 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
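
Each parsing pass ends with a feature_gate.go:384 summary listing the gate map the kubelet actually applies (it appears twice above and once more after the remaining warnings below). When comparing the applied set across boots or nodes, that payload can be turned into a name-to-bool map; the short Go sketch below does this for the exact summary text captured in this log, purely as an illustration.

package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parseGates turns the payload of a "feature gates: {map[...]}" summary line
// into a gate-name -> enabled map.
func parseGates(payload string) map[string]bool {
	gates := map[string]bool{}
	for _, pair := range strings.Fields(payload) {
		name, val, ok := strings.Cut(pair, ":")
		if !ok {
			continue
		}
		if b, err := strconv.ParseBool(val); err == nil {
			gates[name] = b
		}
	}
	return gates
}

func main() {
	// Payload copied from the feature_gate.go:384 summary lines in this log.
	const payload = "DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true " +
		"KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false " +
		"MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true " +
		"RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true " +
		"StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false " +
		"UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false"
	fmt.Println(parseGates(payload))
}
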
Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290197 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290200 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290203 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290205 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:15:33.293545 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:33.290208 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:15:33.293956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.290212 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:15:33.293956 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.292649 2580 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 20:15:33.294595 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.294581 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 20:15:33.295585 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.295574 2580 server.go:1019] "Starting client certificate rotation" Apr 17 20:15:33.295691 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.295674 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 20:15:33.296744 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.296733 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 20:15:33.320252 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.320236 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 20:15:33.323881 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.323854 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 20:15:33.336627 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.336605 2580 log.go:25] "Validated CRI v1 runtime API" Apr 17 20:15:33.342526 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.342506 2580 log.go:25] "Validated CRI v1 image API" Apr 17 20:15:33.346978 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.346964 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 20:15:33.349697 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.349680 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 20:15:33.350321 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.350297 2580 fs.go:135] Filesystem UUIDs: 
map[4ed3a701-8f3d-4fb1-b947-1363d7d3c8ce:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 e690ff7a-930b-496f-9c46-b2317b5a5c0c:/dev/nvme0n1p4] Apr 17 20:15:33.350394 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.350326 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 20:15:33.356223 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.356108 2580 manager.go:217] Machine: {Timestamp:2026-04-17 20:15:33.354354034 +0000 UTC m=+0.430726028 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2499994 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2352ad439eed6c1798acf033b3df08 SystemUUID:ec2352ad-439e-ed6c-1798-acf033b3df08 BootID:49bd0f93-3f84-4d98-8129-674e22ae0caf Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b9:49:d8:ca:37 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b9:49:d8:ca:37 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:42:04:3e:25:4b:35 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 20:15:33.356223 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.356211 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
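
The machine record above reports MemoryCapacity:32812175360 bytes (roughly 30.6 GiB), and the container-manager NodeConfig logged just below reserves 1Gi of memory for the system (SystemReserved) with KubeReserved left null and a 100Mi memory.available hard-eviction threshold. Under the usual node-allocatable calculation (capacity minus kube-reserved, system-reserved, and the hard eviction threshold), that leaves roughly 29.5 GiB allocatable to pods; the small Go sketch below simply reproduces that arithmetic from the figures in this log.

package main

import "fmt"

func main() {
	// Figures taken from this log: the cAdvisor machine record above and the
	// SystemReserved / HardEvictionThresholds fields in the NodeConfig entry below.
	const (
		capacityBytes       = 32812175360 // MemoryCapacity
		kubeReservedBytes   = 0           // KubeReserved is null
		systemReservedBytes = 1 << 30     // SystemReserved memory: "1Gi"
		evictionHardBytes   = 100 << 20   // memory.available hard threshold: 100Mi
	)
	var allocatable int64 = capacityBytes - kubeReservedBytes - systemReservedBytes - evictionHardBytes
	fmt.Printf("allocatable memory ~ %d bytes (%.2f GiB)\n",
		allocatable, float64(allocatable)/float64(1<<30))
}
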
Apr 17 20:15:33.356392 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.356318 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 20:15:33.357220 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.357192 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 20:15:33.357393 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.357222 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-57.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 20:15:33.357498 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.357405 2580 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 20:15:33.357498 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.357417 2580 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 20:15:33.357498 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.357440 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 20:15:33.358440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.358426 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 20:15:33.359626 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.359613 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:15:33.359755 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.359744 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 20:15:33.362014 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.362003 2580 kubelet.go:491] "Attempting to sync node with API server" Apr 17 20:15:33.362078 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.362021 2580 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 17 20:15:33.362078 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.362038 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 20:15:33.362078 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.362053 2580 kubelet.go:397] "Adding apiserver pod source" Apr 17 20:15:33.362078 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.362066 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 20:15:33.363129 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.363116 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 20:15:33.363192 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.363141 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 20:15:33.366216 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.366197 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wd97m" Apr 17 20:15:33.366521 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.366507 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 20:15:33.368086 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.368070 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 20:15:33.369672 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.369658 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 20:15:33.369739 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.369676 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 20:15:33.369739 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.369682 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 20:15:33.369739 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.369688 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 20:15:33.369739 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.369694 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 20:15:33.369739 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.369704 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 20:15:33.369739 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.369713 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 20:15:33.369739 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.369721 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 20:15:33.369739 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.369730 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 20:15:33.369739 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.369735 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 20:15:33.369974 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.369749 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 20:15:33.369974 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.369759 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 20:15:33.370662 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.370650 
2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 20:15:33.370662 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.370661 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 20:15:33.373020 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.372996 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wd97m" Apr 17 20:15:33.373387 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.373360 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-57.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 20:15:33.373513 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.373486 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 20:15:33.375016 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.375001 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 20:15:33.375097 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.375045 2580 server.go:1295] "Started kubelet" Apr 17 20:15:33.375143 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.375092 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 20:15:33.375177 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.375147 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 20:15:33.375206 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.375187 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 20:15:33.375868 ip-10-0-132-57 systemd[1]: Started Kubernetes Kubelet. Apr 17 20:15:33.376838 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.376761 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 20:15:33.377845 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.377828 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 17 20:15:33.383044 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.383021 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 20:15:33.383121 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.383105 2580 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 20:15:33.383484 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.383452 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 20:15:33.384547 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.384531 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 20:15:33.384721 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.384535 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 20:15:33.384874 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.384728 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 20:15:33.384874 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.384834 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 17 20:15:33.384874 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.384843 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 17 20:15:33.385091 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.384899 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-57.ec2.internal\" not found" Apr 17 20:15:33.385091 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.385067 2580 factory.go:55] Registering systemd factory Apr 17 20:15:33.385091 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.385087 2580 factory.go:223] Registration of the systemd container factory successfully Apr 17 20:15:33.385295 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.385281 2580 factory.go:153] Registering CRI-O factory Apr 17 20:15:33.385348 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.385301 2580 factory.go:223] Registration of the crio container factory successfully Apr 17 20:15:33.385423 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.385410 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 20:15:33.385506 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.385440 2580 factory.go:103] Registering Raw factory Apr 17 20:15:33.385506 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.385474 2580 manager.go:1196] Started watching for new ooms in manager Apr 17 20:15:33.385603 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.385562 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-57.ec2.internal" not found Apr 17 20:15:33.385603 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.385589 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:33.385859 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.385844 2580 manager.go:319] Starting recovery of all containers Apr 17 20:15:33.390651 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.390625 2580 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-57.ec2.internal\" not found" node="ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.395145 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.395127 2580 manager.go:324] Recovery completed Apr 17 20:15:33.400525 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.400509 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-57.ec2.internal" not found Apr 17 20:15:33.400633 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.400621 
2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:15:33.403256 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.403239 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:15:33.403315 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.403268 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:15:33.403315 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.403280 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:15:33.403752 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.403739 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 20:15:33.403752 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.403752 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 20:15:33.403837 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.403768 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:15:33.405753 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.405742 2580 policy_none.go:49] "None policy: Start" Apr 17 20:15:33.405787 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.405758 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 20:15:33.406154 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.406146 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 17 20:15:33.447026 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.447011 2580 manager.go:341] "Starting Device Plugin manager" Apr 17 20:15:33.459833 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.447041 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 20:15:33.459833 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.447054 2580 server.go:85] "Starting device plugin registration server" Apr 17 20:15:33.459833 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.447252 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 20:15:33.459833 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.447262 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 20:15:33.459833 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.447388 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 20:15:33.459833 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.447499 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 20:15:33.459833 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.447508 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 20:15:33.459833 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.448048 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 20:15:33.459833 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.448093 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-57.ec2.internal\" not found" Apr 17 20:15:33.459833 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.455497 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-57.ec2.internal" not found Apr 17 20:15:33.524537 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.524471 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 20:15:33.525750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.525728 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 20:15:33.525750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.525751 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 20:15:33.525882 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.525767 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 20:15:33.525882 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.525774 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 20:15:33.525882 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.525801 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 20:15:33.527950 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.527932 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:33.547957 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.547942 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:15:33.548736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.548719 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:15:33.548812 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.548748 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:15:33.548812 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.548759 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:15:33.548812 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.548780 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.557969 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.557953 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.558028 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.557973 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-57.ec2.internal\": node \"ip-10-0-132-57.ec2.internal\" not found" Apr 17 20:15:33.574307 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.574282 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-57.ec2.internal\" not found" Apr 17 20:15:33.627420 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.627384 2580 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal"] Apr 17 20:15:33.627508 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.627476 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:15:33.628173 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.628161 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:15:33.628224 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.628186 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:15:33.628224 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.628197 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:15:33.629441 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.629430 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:15:33.629597 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.629582 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.629635 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.629612 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:15:33.630105 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.630091 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:15:33.630166 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.630105 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:15:33.630166 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.630118 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:15:33.630166 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.630126 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:15:33.630166 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.630131 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:15:33.630166 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.630136 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:15:33.631523 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.631508 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.631596 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.631530 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:15:33.632161 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.632143 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:15:33.632225 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.632173 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:15:33.632225 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.632183 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:15:33.655998 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.655980 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-57.ec2.internal\" not found" node="ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.660425 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.660410 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-57.ec2.internal\" not found" node="ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.674355 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.674324 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-57.ec2.internal\" not found" Apr 17 20:15:33.686737 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.686718 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e758ab3eb0bee98b6ae04d49ace534ee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal\" (UID: \"e758ab3eb0bee98b6ae04d49ace534ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.686791 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.686742 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e758ab3eb0bee98b6ae04d49ace534ee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal\" (UID: \"e758ab3eb0bee98b6ae04d49ace534ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.686791 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.686757 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ad918dee123885abb804caebda37d740-config\") pod \"kube-apiserver-proxy-ip-10-0-132-57.ec2.internal\" (UID: \"ad918dee123885abb804caebda37d740\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.774729 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.774674 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-57.ec2.internal\" not found" Apr 17 20:15:33.787750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.787731 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ad918dee123885abb804caebda37d740-config\") pod 
\"kube-apiserver-proxy-ip-10-0-132-57.ec2.internal\" (UID: \"ad918dee123885abb804caebda37d740\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.787830 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.787812 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ad918dee123885abb804caebda37d740-config\") pod \"kube-apiserver-proxy-ip-10-0-132-57.ec2.internal\" (UID: \"ad918dee123885abb804caebda37d740\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.787874 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.787833 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e758ab3eb0bee98b6ae04d49ace534ee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal\" (UID: \"e758ab3eb0bee98b6ae04d49ace534ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.787910 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.787865 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e758ab3eb0bee98b6ae04d49ace534ee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal\" (UID: \"e758ab3eb0bee98b6ae04d49ace534ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.787910 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.787902 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e758ab3eb0bee98b6ae04d49ace534ee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal\" (UID: \"e758ab3eb0bee98b6ae04d49ace534ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.787979 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.787907 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e758ab3eb0bee98b6ae04d49ace534ee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal\" (UID: \"e758ab3eb0bee98b6ae04d49ace534ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.875153 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.875133 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-57.ec2.internal\" not found" Apr 17 20:15:33.957696 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.957666 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.963219 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:33.963203 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal" Apr 17 20:15:33.975776 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:33.975758 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-57.ec2.internal\" not found" Apr 17 20:15:34.076378 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:34.076307 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-57.ec2.internal\" not found" Apr 17 20:15:34.176856 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:34.176829 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-57.ec2.internal\" not found" Apr 17 20:15:34.277401 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:34.277383 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-57.ec2.internal\" not found" Apr 17 20:15:34.295898 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.295877 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 20:15:34.296020 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.296005 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 20:15:34.296059 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.296033 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 20:15:34.375555 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.375489 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 20:10:33 +0000 UTC" deadline="2027-12-28 13:54:53.329507652 +0000 UTC" Apr 17 20:15:34.375555 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.375524 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14873h39m18.95398734s" Apr 17 20:15:34.377652 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:34.377633 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-57.ec2.internal\" not found" Apr 17 20:15:34.383789 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.383772 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 20:15:34.398698 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.398678 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 20:15:34.418769 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.418745 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-pm56n" Apr 17 20:15:34.425702 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.425677 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-pm56n" Apr 17 20:15:34.437616 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.437597 2580 
reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:34.484568 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.484550 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" Apr 17 20:15:34.494900 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:34.494872 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode758ab3eb0bee98b6ae04d49ace534ee.slice/crio-05a4b08acc443f388ea5b16014735e6801be7ad83b250611ce651e79e9c7b158 WatchSource:0}: Error finding container 05a4b08acc443f388ea5b16014735e6801be7ad83b250611ce651e79e9c7b158: Status 404 returned error can't find the container with id 05a4b08acc443f388ea5b16014735e6801be7ad83b250611ce651e79e9c7b158 Apr 17 20:15:34.495116 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:34.495098 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad918dee123885abb804caebda37d740.slice/crio-dc27221c04db7413ee295b8d3bbe6dfe4eaf7c44ebf2bf8122e054e3bd7dbf5c WatchSource:0}: Error finding container dc27221c04db7413ee295b8d3bbe6dfe4eaf7c44ebf2bf8122e054e3bd7dbf5c: Status 404 returned error can't find the container with id dc27221c04db7413ee295b8d3bbe6dfe4eaf7c44ebf2bf8122e054e3bd7dbf5c Apr 17 20:15:34.495806 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.495790 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 20:15:34.497922 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.497905 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal" Apr 17 20:15:34.498763 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.498749 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:15:34.505543 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.505528 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 20:15:34.528009 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.527964 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal" event={"ID":"ad918dee123885abb804caebda37d740","Type":"ContainerStarted","Data":"dc27221c04db7413ee295b8d3bbe6dfe4eaf7c44ebf2bf8122e054e3bd7dbf5c"} Apr 17 20:15:34.528826 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.528806 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" event={"ID":"e758ab3eb0bee98b6ae04d49ace534ee","Type":"ContainerStarted","Data":"05a4b08acc443f388ea5b16014735e6801be7ad83b250611ce651e79e9c7b158"} Apr 17 20:15:34.699184 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:34.699113 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:35.363934 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.363902 2580 apiserver.go:52] "Watching apiserver" Apr 17 20:15:35.372151 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.372127 2580 reflector.go:430] "Caches populated" type="*v1.Pod" 
reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 20:15:35.372660 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.372635 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal","openshift-multus/multus-additional-cni-plugins-qvs8p","openshift-multus/multus-lj6hr","openshift-multus/network-metrics-daemon-99wq2","openshift-ovn-kubernetes/ovnkube-node-vrmd6","kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal","openshift-cluster-node-tuning-operator/tuned-zgtt5","openshift-dns/node-resolver-h6fk4","openshift-image-registry/node-ca-n6lwg","openshift-network-diagnostics/network-check-target-d9lfz","openshift-network-operator/iptables-alerter-s6gxh","kube-system/konnectivity-agent-xtvgg"] Apr 17 20:15:35.375060 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.375030 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n6lwg" Apr 17 20:15:35.377127 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.377095 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 20:15:35.377127 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.377110 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 20:15:35.377284 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.377132 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 20:15:35.377284 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.377182 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.377348 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.377285 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ct92n\"" Apr 17 20:15:35.379838 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.379115 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 20:15:35.379838 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.379237 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.379838 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.379473 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 20:15:35.379838 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.379585 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 20:15:35.379838 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.379643 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-w2cwx\"" Apr 17 20:15:35.379838 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.379650 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 20:15:35.379838 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.379585 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 20:15:35.381146 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.381126 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 20:15:35.381256 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.381243 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gqhsv\"" Apr 17 20:15:35.381584 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.381547 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:35.381690 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:35.381634 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:35.383881 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.383862 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.385796 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.385777 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 20:15:35.385979 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.385960 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 20:15:35.386058 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.386000 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 20:15:35.386146 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.385964 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 20:15:35.386233 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.386183 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.386392 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.386370 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 20:15:35.386551 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.386531 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 20:15:35.386654 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.386534 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l6bsz\"" Apr 17 20:15:35.388181 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.388165 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 20:15:35.388587 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.388571 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gp4jr\"" Apr 17 20:15:35.388725 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.388709 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:15:35.389914 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.389897 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h6fk4" Apr 17 20:15:35.391729 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.391710 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 20:15:35.392085 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.391957 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cv7rk\"" Apr 17 20:15:35.392085 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.392066 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 20:15:35.392751 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.392732 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.394665 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.394637 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 20:15:35.394665 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.394659 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 20:15:35.394862 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.394673 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8dfth\"" Apr 17 20:15:35.394862 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.394674 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 20:15:35.394973 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.394890 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-sys\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.394973 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.394921 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-var-lib-kubelet\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.394973 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.394946 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16a5e25e-23a4-4106-a67e-adda44b1aaa6-os-release\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.394973 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.394968 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16a5e25e-23a4-4106-a67e-adda44b1aaa6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.395160 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.394993 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzwk4\" (UniqueName: \"kubernetes.io/projected/16a5e25e-23a4-4106-a67e-adda44b1aaa6-kube-api-access-qzwk4\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.395160 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.394969 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:35.395160 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395016 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-multus-cni-dir\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.395160 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:35.395066 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:35.395160 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395106 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-etc-kubernetes\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.395160 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395136 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gcn4\" (UniqueName: \"kubernetes.io/projected/018ba037-0cf6-4ce0-ba07-95893c240cd2-kube-api-access-4gcn4\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.395160 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395164 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-var-lib-openvswitch\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395188 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-run\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395231 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-multus-socket-dir-parent\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395254 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-sysctl-conf\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395279 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-kubelet\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395303 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c81797ca-3338-4294-8ef8-fb0416677637-ovnkube-config\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395327 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c81797ca-3338-4294-8ef8-fb0416677637-ovnkube-script-lib\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395349 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-lib-modules\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395369 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-os-release\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395394 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-var-lib-cni-bin\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395422 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-etc-openvswitch\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395483 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-sysctl-d\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395511 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9d8cbec6-ac91-4373-b7ef-593404bf8a86-serviceca\") pod \"node-ca-n6lwg\" (UID: 
\"9d8cbec6-ac91-4373-b7ef-593404bf8a86\") " pod="openshift-image-registry/node-ca-n6lwg" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395534 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7jjf\" (UniqueName: \"kubernetes.io/projected/9d8cbec6-ac91-4373-b7ef-593404bf8a86-kube-api-access-c7jjf\") pod \"node-ca-n6lwg\" (UID: \"9d8cbec6-ac91-4373-b7ef-593404bf8a86\") " pod="openshift-image-registry/node-ca-n6lwg" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395570 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-system-cni-dir\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395594 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-run-multus-certs\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.395668 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395657 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc88n\" (UniqueName: \"kubernetes.io/projected/d943896a-8c08-4d43-b1c4-d738b0079503-kube-api-access-mc88n\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395680 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-log-socket\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395705 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16a5e25e-23a4-4106-a67e-adda44b1aaa6-system-cni-dir\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395735 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-var-lib-cni-multus\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395759 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/018ba037-0cf6-4ce0-ba07-95893c240cd2-multus-daemon-config\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395803 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-systemd-units\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395835 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/630511e5-409a-44f6-9416-ee72411c751e-tmp\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395868 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-run-openvswitch\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395884 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-run-ovn\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395903 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-node-log\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395957 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-kubernetes\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.395986 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-systemd\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396001 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-run-k8s-cni-cncf-io\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396042 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16a5e25e-23a4-4106-a67e-adda44b1aaa6-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396078 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/16a5e25e-23a4-4106-a67e-adda44b1aaa6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396108 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/16a5e25e-23a4-4106-a67e-adda44b1aaa6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.396440 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396131 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-sysconfig\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.397050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396153 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/630511e5-409a-44f6-9416-ee72411c751e-etc-tuned\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.397050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396175 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/018ba037-0cf6-4ce0-ba07-95893c240cd2-cni-binary-copy\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.397050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396196 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-host\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.397050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396239 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-slash\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.397050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396257 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-run-systemd\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 
17 20:15:35.397050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396815 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.397050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396856 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:35.397050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396937 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c81797ca-3338-4294-8ef8-fb0416677637-ovn-node-metrics-cert\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.397050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396968 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-modprobe-d\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.397050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.396996 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-var-lib-kubelet\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.397050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.397020 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-cni-bin\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.397050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.397046 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d8cbec6-ac91-4373-b7ef-593404bf8a86-host\") pod \"node-ca-n6lwg\" (UID: \"9d8cbec6-ac91-4373-b7ef-593404bf8a86\") " pod="openshift-image-registry/node-ca-n6lwg" Apr 17 20:15:35.397571 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.397075 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-run-netns\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.397571 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.397102 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-multus-conf-dir\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.397571 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.397129 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-run-netns\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.397571 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.397194 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c81797ca-3338-4294-8ef8-fb0416677637-env-overrides\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.397571 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.397236 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8ps6\" (UniqueName: \"kubernetes.io/projected/c81797ca-3338-4294-8ef8-fb0416677637-kube-api-access-z8ps6\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.397968 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.397941 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ppl\" (UniqueName: \"kubernetes.io/projected/630511e5-409a-44f6-9416-ee72411c751e-kube-api-access-b9ppl\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.398042 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.397300 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-s6gxh" Apr 17 20:15:35.398042 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.398016 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-hostroot\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.398121 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.398076 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-run-ovn-kubernetes\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.398121 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.398097 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-cni-netd\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.398184 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.398128 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16a5e25e-23a4-4106-a67e-adda44b1aaa6-cnibin\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.398184 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.398156 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-cnibin\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.400124 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.400096 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 20:15:35.400371 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.400353 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:15:35.400371 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.400371 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 20:15:35.400678 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.400658 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-h88qx\"" Apr 17 20:15:35.402183 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.402153 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-xtvgg" Apr 17 20:15:35.404448 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.404430 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6njhp\"" Apr 17 20:15:35.404557 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.404450 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 20:15:35.404706 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.404686 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 20:15:35.426613 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.426586 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:10:34 +0000 UTC" deadline="2027-11-03 00:01:12.112708793 +0000 UTC" Apr 17 20:15:35.426730 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.426614 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13539h45m36.686098604s" Apr 17 20:15:35.485650 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.485626 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 20:15:35.499183 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499160 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-run-netns\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.499307 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499190 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c81797ca-3338-4294-8ef8-fb0416677637-env-overrides\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.499307 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499212 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8ps6\" (UniqueName: \"kubernetes.io/projected/c81797ca-3338-4294-8ef8-fb0416677637-kube-api-access-z8ps6\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.499307 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499231 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ppl\" (UniqueName: \"kubernetes.io/projected/630511e5-409a-44f6-9416-ee72411c751e-kube-api-access-b9ppl\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.499492 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499344 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-run-netns\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.499492 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499393 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-hostroot\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.499492 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499424 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-run-ovn-kubernetes\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.499492 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499450 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-cni-netd\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.499492 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499481 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-hostroot\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499502 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16a5e25e-23a4-4106-a67e-adda44b1aaa6-cnibin\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499519 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-run-ovn-kubernetes\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499530 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-cnibin\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499535 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-cni-netd\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499569 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-sys\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499578 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/16a5e25e-23a4-4106-a67e-adda44b1aaa6-cnibin\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499582 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-cnibin\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499596 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-var-lib-kubelet\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499615 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-sys\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499625 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16a5e25e-23a4-4106-a67e-adda44b1aaa6-os-release\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499652 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16a5e25e-23a4-4106-a67e-adda44b1aaa6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499657 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-var-lib-kubelet\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499677 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzwk4\" (UniqueName: \"kubernetes.io/projected/16a5e25e-23a4-4106-a67e-adda44b1aaa6-kube-api-access-qzwk4\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499696 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16a5e25e-23a4-4106-a67e-adda44b1aaa6-os-release\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499703 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-multus-cni-dir\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.499736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499727 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-etc-kubernetes\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499749 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gcn4\" (UniqueName: \"kubernetes.io/projected/018ba037-0cf6-4ce0-ba07-95893c240cd2-kube-api-access-4gcn4\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499745 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c81797ca-3338-4294-8ef8-fb0416677637-env-overrides\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499775 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-multus-cni-dir\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499773 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-var-lib-openvswitch\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499782 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16a5e25e-23a4-4106-a67e-adda44b1aaa6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499812 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-run\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499818 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-var-lib-openvswitch\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499829 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-etc-kubernetes\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499837 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-multus-socket-dir-parent\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499867 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htcdf\" (UniqueName: \"kubernetes.io/projected/28b18d89-4df2-405e-8e06-5f5e39694305-kube-api-access-htcdf\") pod \"node-resolver-h6fk4\" (UID: \"28b18d89-4df2-405e-8e06-5f5e39694305\") " pod="openshift-dns/node-resolver-h6fk4" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499896 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-sys-fs\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499869 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-run\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499909 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-multus-socket-dir-parent\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499920 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m7jr\" (UniqueName: \"kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr\") pod \"network-check-target-d9lfz\" (UID: \"05059fd8-9f1b-4374-81cf-fd56830ab0bb\") " pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499945 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-sysctl-conf\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.499996 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-kubelet\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.500450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500027 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c81797ca-3338-4294-8ef8-fb0416677637-ovnkube-config\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500050 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-kubelet\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500052 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c81797ca-3338-4294-8ef8-fb0416677637-ovnkube-script-lib\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500093 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-lib-modules\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500110 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-os-release\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500125 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-var-lib-cni-bin\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500122 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-sysctl-conf\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500147 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/87977dd5-cce0-46ae-8d11-ddfd87452aef-konnectivity-ca\") pod \"konnectivity-agent-xtvgg\" (UID: \"87977dd5-cce0-46ae-8d11-ddfd87452aef\") " pod="kube-system/konnectivity-agent-xtvgg" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500168 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-etc-openvswitch\") pod \"ovnkube-node-vrmd6\" (UID: 
\"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500183 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-sysctl-d\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500189 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-os-release\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500197 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9d8cbec6-ac91-4373-b7ef-593404bf8a86-serviceca\") pod \"node-ca-n6lwg\" (UID: \"9d8cbec6-ac91-4373-b7ef-593404bf8a86\") " pod="openshift-image-registry/node-ca-n6lwg" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500213 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7jjf\" (UniqueName: \"kubernetes.io/projected/9d8cbec6-ac91-4373-b7ef-593404bf8a86-kube-api-access-c7jjf\") pod \"node-ca-n6lwg\" (UID: \"9d8cbec6-ac91-4373-b7ef-593404bf8a86\") " pod="openshift-image-registry/node-ca-n6lwg" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500238 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-system-cni-dir\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500254 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-run-multus-certs\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500270 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mc88n\" (UniqueName: \"kubernetes.io/projected/d943896a-8c08-4d43-b1c4-d738b0079503-kube-api-access-mc88n\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500274 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-lib-modules\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500285 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-log-socket\") pod \"ovnkube-node-vrmd6\" (UID: 
\"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.501281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500314 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28b18d89-4df2-405e-8e06-5f5e39694305-tmp-dir\") pod \"node-resolver-h6fk4\" (UID: \"28b18d89-4df2-405e-8e06-5f5e39694305\") " pod="openshift-dns/node-resolver-h6fk4" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500338 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-socket-dir\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500353 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-etc-selinux\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500368 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16a5e25e-23a4-4106-a67e-adda44b1aaa6-system-cni-dir\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500383 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-var-lib-cni-multus\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500408 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/018ba037-0cf6-4ce0-ba07-95893c240cd2-multus-daemon-config\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500427 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-systemd-units\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500442 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/630511e5-409a-44f6-9416-ee72411c751e-tmp\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500485 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/c81797ca-3338-4294-8ef8-fb0416677637-ovnkube-config\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500478 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-device-dir\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500526 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szkzd\" (UniqueName: \"kubernetes.io/projected/7c43b13c-fe3e-4514-82f0-afc4e752be0a-kube-api-access-szkzd\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500546 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-run-openvswitch\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500561 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-run-ovn\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500575 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-node-log\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500581 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9d8cbec6-ac91-4373-b7ef-593404bf8a86-serviceca\") pod \"node-ca-n6lwg\" (UID: \"9d8cbec6-ac91-4373-b7ef-593404bf8a86\") " pod="openshift-image-registry/node-ca-n6lwg" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500592 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-kubernetes\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500608 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-systemd\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.502052 ip-10-0-132-57 kubenswrapper[2580]: I0417 
20:15:35.500624 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-etc-openvswitch\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500652 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-run-k8s-cni-cncf-io\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500657 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-run-openvswitch\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500681 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-run-ovn\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500686 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-system-cni-dir\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500698 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c81797ca-3338-4294-8ef8-fb0416677637-ovnkube-script-lib\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500701 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-node-log\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500729 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-sysctl-d\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500624 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-run-k8s-cni-cncf-io\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500727 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-kubernetes\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500772 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16a5e25e-23a4-4106-a67e-adda44b1aaa6-system-cni-dir\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500785 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-run-multus-certs\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500749 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-var-lib-cni-bin\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500781 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500801 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-log-socket\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500817 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-systemd\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500827 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16a5e25e-23a4-4106-a67e-adda44b1aaa6-cni-binary-copy\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500848 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-systemd-units\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 
20:15:35.502713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500853 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-var-lib-cni-multus\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500855 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/16a5e25e-23a4-4106-a67e-adda44b1aaa6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500880 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/16a5e25e-23a4-4106-a67e-adda44b1aaa6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.500899 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nc8k\" (UniqueName: \"kubernetes.io/projected/6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb-kube-api-access-2nc8k\") pod \"iptables-alerter-s6gxh\" (UID: \"6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb\") " pod="openshift-network-operator/iptables-alerter-s6gxh" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501082 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-sysconfig\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501070 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501114 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/630511e5-409a-44f6-9416-ee72411c751e-etc-tuned\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501141 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/018ba037-0cf6-4ce0-ba07-95893c240cd2-cni-binary-copy\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501194 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/018ba037-0cf6-4ce0-ba07-95893c240cd2-multus-daemon-config\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501234 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/28b18d89-4df2-405e-8e06-5f5e39694305-hosts-file\") pod \"node-resolver-h6fk4\" (UID: \"28b18d89-4df2-405e-8e06-5f5e39694305\") " pod="openshift-dns/node-resolver-h6fk4" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501253 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-host\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501291 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-sysconfig\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501343 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb-host-slash\") pod \"iptables-alerter-s6gxh\" (UID: \"6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb\") " pod="openshift-network-operator/iptables-alerter-s6gxh" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501376 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/16a5e25e-23a4-4106-a67e-adda44b1aaa6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501378 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-slash\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501416 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-slash\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501416 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-host\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.503332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501425 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-run-systemd\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501452 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501499 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501569 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-run-systemd\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501719 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501842 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/16a5e25e-23a4-4106-a67e-adda44b1aaa6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: E0417 
20:15:35.501903 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501982 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c81797ca-3338-4294-8ef8-fb0416677637-ovn-node-metrics-cert\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.501987 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/018ba037-0cf6-4ce0-ba07-95893c240cd2-cni-binary-copy\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502011 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-modprobe-d\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:35.502041 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs podName:d943896a-8c08-4d43-b1c4-d738b0079503 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:36.002020452 +0000 UTC m=+3.078392438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs") pod "network-metrics-daemon-99wq2" (UID: "d943896a-8c08-4d43-b1c4-d738b0079503") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502057 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-var-lib-kubelet\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502085 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-registration-dir\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502095 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/630511e5-409a-44f6-9416-ee72411c751e-etc-modprobe-d\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502113 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb-iptables-alerter-script\") pod \"iptables-alerter-s6gxh\" (UID: \"6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb\") " pod="openshift-network-operator/iptables-alerter-s6gxh" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502140 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-cni-bin\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502137 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-var-lib-kubelet\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.503907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502163 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d8cbec6-ac91-4373-b7ef-593404bf8a86-host\") pod \"node-ca-n6lwg\" (UID: \"9d8cbec6-ac91-4373-b7ef-593404bf8a86\") " pod="openshift-image-registry/node-ca-n6lwg" Apr 17 20:15:35.504638 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502175 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16a5e25e-23a4-4106-a67e-adda44b1aaa6-cni-binary-copy\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.504638 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502187 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-run-netns\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.504638 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502192 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c81797ca-3338-4294-8ef8-fb0416677637-host-cni-bin\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.504638 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502225 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-host-run-netns\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.504638 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502211 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-multus-conf-dir\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.504638 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/9d8cbec6-ac91-4373-b7ef-593404bf8a86-host\") pod \"node-ca-n6lwg\" (UID: \"9d8cbec6-ac91-4373-b7ef-593404bf8a86\") " pod="openshift-image-registry/node-ca-n6lwg" Apr 17 20:15:35.504638 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502260 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/87977dd5-cce0-46ae-8d11-ddfd87452aef-agent-certs\") pod \"konnectivity-agent-xtvgg\" (UID: \"87977dd5-cce0-46ae-8d11-ddfd87452aef\") " pod="kube-system/konnectivity-agent-xtvgg" Apr 17 20:15:35.504638 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.502287 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/018ba037-0cf6-4ce0-ba07-95893c240cd2-multus-conf-dir\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.504638 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.504451 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/630511e5-409a-44f6-9416-ee72411c751e-etc-tuned\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.504940 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.504761 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c81797ca-3338-4294-8ef8-fb0416677637-ovn-node-metrics-cert\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.505636 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.505615 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/630511e5-409a-44f6-9416-ee72411c751e-tmp\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.510534 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.510511 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8ps6\" (UniqueName: \"kubernetes.io/projected/c81797ca-3338-4294-8ef8-fb0416677637-kube-api-access-z8ps6\") pod \"ovnkube-node-vrmd6\" (UID: \"c81797ca-3338-4294-8ef8-fb0416677637\") " pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.514599 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.514576 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9ppl\" (UniqueName: \"kubernetes.io/projected/630511e5-409a-44f6-9416-ee72411c751e-kube-api-access-b9ppl\") pod \"tuned-zgtt5\" (UID: \"630511e5-409a-44f6-9416-ee72411c751e\") " pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.514919 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.514883 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzwk4\" (UniqueName: \"kubernetes.io/projected/16a5e25e-23a4-4106-a67e-adda44b1aaa6-kube-api-access-qzwk4\") pod \"multus-additional-cni-plugins-qvs8p\" (UID: \"16a5e25e-23a4-4106-a67e-adda44b1aaa6\") " pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.518556 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.516013 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4gcn4\" (UniqueName: \"kubernetes.io/projected/018ba037-0cf6-4ce0-ba07-95893c240cd2-kube-api-access-4gcn4\") pod \"multus-lj6hr\" (UID: \"018ba037-0cf6-4ce0-ba07-95893c240cd2\") " pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.518556 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.516341 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc88n\" (UniqueName: \"kubernetes.io/projected/d943896a-8c08-4d43-b1c4-d738b0079503-kube-api-access-mc88n\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:35.519443 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.518898 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7jjf\" (UniqueName: \"kubernetes.io/projected/9d8cbec6-ac91-4373-b7ef-593404bf8a86-kube-api-access-c7jjf\") pod \"node-ca-n6lwg\" (UID: \"9d8cbec6-ac91-4373-b7ef-593404bf8a86\") " pod="openshift-image-registry/node-ca-n6lwg" Apr 17 20:15:35.601419 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.601392 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:35.602586 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602564 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/28b18d89-4df2-405e-8e06-5f5e39694305-hosts-file\") pod \"node-resolver-h6fk4\" (UID: \"28b18d89-4df2-405e-8e06-5f5e39694305\") " pod="openshift-dns/node-resolver-h6fk4" Apr 17 20:15:35.602689 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602598 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb-host-slash\") pod \"iptables-alerter-s6gxh\" (UID: \"6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb\") " pod="openshift-network-operator/iptables-alerter-s6gxh" Apr 17 20:15:35.602689 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602655 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-registration-dir\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.602689 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602660 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb-host-slash\") pod \"iptables-alerter-s6gxh\" (UID: \"6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb\") " pod="openshift-network-operator/iptables-alerter-s6gxh" Apr 17 20:15:35.602689 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602679 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb-iptables-alerter-script\") pod \"iptables-alerter-s6gxh\" (UID: \"6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb\") " pod="openshift-network-operator/iptables-alerter-s6gxh" Apr 17 20:15:35.602890 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602708 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/87977dd5-cce0-46ae-8d11-ddfd87452aef-agent-certs\") pod \"konnectivity-agent-xtvgg\" (UID: \"87977dd5-cce0-46ae-8d11-ddfd87452aef\") " pod="kube-system/konnectivity-agent-xtvgg" Apr 17 20:15:35.602890 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602720 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/28b18d89-4df2-405e-8e06-5f5e39694305-hosts-file\") pod \"node-resolver-h6fk4\" (UID: \"28b18d89-4df2-405e-8e06-5f5e39694305\") " pod="openshift-dns/node-resolver-h6fk4" Apr 17 20:15:35.602890 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602745 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htcdf\" (UniqueName: \"kubernetes.io/projected/28b18d89-4df2-405e-8e06-5f5e39694305-kube-api-access-htcdf\") pod \"node-resolver-h6fk4\" (UID: \"28b18d89-4df2-405e-8e06-5f5e39694305\") " pod="openshift-dns/node-resolver-h6fk4" Apr 17 20:15:35.602890 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602727 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-registration-dir\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.602890 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-sys-fs\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.602890 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602808 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m7jr\" (UniqueName: \"kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr\") pod \"network-check-target-d9lfz\" (UID: \"05059fd8-9f1b-4374-81cf-fd56830ab0bb\") " pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:35.602890 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602837 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/87977dd5-cce0-46ae-8d11-ddfd87452aef-konnectivity-ca\") pod \"konnectivity-agent-xtvgg\" (UID: \"87977dd5-cce0-46ae-8d11-ddfd87452aef\") " pod="kube-system/konnectivity-agent-xtvgg" Apr 17 20:15:35.602890 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602855 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-sys-fs\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.602890 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602866 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28b18d89-4df2-405e-8e06-5f5e39694305-tmp-dir\") pod \"node-resolver-h6fk4\" (UID: \"28b18d89-4df2-405e-8e06-5f5e39694305\") " pod="openshift-dns/node-resolver-h6fk4" Apr 17 20:15:35.602890 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602892 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-socket-dir\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.603352 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602916 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-etc-selinux\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.603352 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602945 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-device-dir\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.603352 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.602969 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szkzd\" (UniqueName: \"kubernetes.io/projected/7c43b13c-fe3e-4514-82f0-afc4e752be0a-kube-api-access-szkzd\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.603352 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.603023 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-etc-selinux\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.603352 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.603059 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.603352 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.603094 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nc8k\" (UniqueName: \"kubernetes.io/projected/6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb-kube-api-access-2nc8k\") pod \"iptables-alerter-s6gxh\" (UID: \"6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb\") " pod="openshift-network-operator/iptables-alerter-s6gxh" Apr 17 20:15:35.603352 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.603193 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-socket-dir\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.603352 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.603226 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28b18d89-4df2-405e-8e06-5f5e39694305-tmp-dir\") 
pod \"node-resolver-h6fk4\" (UID: \"28b18d89-4df2-405e-8e06-5f5e39694305\") " pod="openshift-dns/node-resolver-h6fk4" Apr 17 20:15:35.603352 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.603248 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.603352 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.603283 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7c43b13c-fe3e-4514-82f0-afc4e752be0a-device-dir\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.603717 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.603698 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb-iptables-alerter-script\") pod \"iptables-alerter-s6gxh\" (UID: \"6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb\") " pod="openshift-network-operator/iptables-alerter-s6gxh" Apr 17 20:15:35.603717 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.603704 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/87977dd5-cce0-46ae-8d11-ddfd87452aef-konnectivity-ca\") pod \"konnectivity-agent-xtvgg\" (UID: \"87977dd5-cce0-46ae-8d11-ddfd87452aef\") " pod="kube-system/konnectivity-agent-xtvgg" Apr 17 20:15:35.605334 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.605305 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/87977dd5-cce0-46ae-8d11-ddfd87452aef-agent-certs\") pod \"konnectivity-agent-xtvgg\" (UID: \"87977dd5-cce0-46ae-8d11-ddfd87452aef\") " pod="kube-system/konnectivity-agent-xtvgg" Apr 17 20:15:35.606399 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.606381 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:35.609057 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:35.609035 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:35.609136 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:35.609063 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:35.609136 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:35.609078 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7m7jr for pod openshift-network-diagnostics/network-check-target-d9lfz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:35.609221 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:35.609149 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr podName:05059fd8-9f1b-4374-81cf-fd56830ab0bb nodeName:}" failed. 
No retries permitted until 2026-04-17 20:15:36.109131541 +0000 UTC m=+3.185503538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7m7jr" (UniqueName: "kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr") pod "network-check-target-d9lfz" (UID: "05059fd8-9f1b-4374-81cf-fd56830ab0bb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:35.610381 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.610357 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htcdf\" (UniqueName: \"kubernetes.io/projected/28b18d89-4df2-405e-8e06-5f5e39694305-kube-api-access-htcdf\") pod \"node-resolver-h6fk4\" (UID: \"28b18d89-4df2-405e-8e06-5f5e39694305\") " pod="openshift-dns/node-resolver-h6fk4" Apr 17 20:15:35.610617 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.610599 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nc8k\" (UniqueName: \"kubernetes.io/projected/6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb-kube-api-access-2nc8k\") pod \"iptables-alerter-s6gxh\" (UID: \"6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb\") " pod="openshift-network-operator/iptables-alerter-s6gxh" Apr 17 20:15:35.612088 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.612067 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szkzd\" (UniqueName: \"kubernetes.io/projected/7c43b13c-fe3e-4514-82f0-afc4e752be0a-kube-api-access-szkzd\") pod \"aws-ebs-csi-driver-node-w8d2r\" (UID: \"7c43b13c-fe3e-4514-82f0-afc4e752be0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.691971 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.691909 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n6lwg" Apr 17 20:15:35.699592 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.699568 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qvs8p" Apr 17 20:15:35.710238 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.710220 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lj6hr" Apr 17 20:15:35.714872 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.714841 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:35.721348 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.721330 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" Apr 17 20:15:35.728917 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.728895 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h6fk4" Apr 17 20:15:35.735476 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.735441 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" Apr 17 20:15:35.741952 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.741935 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-s6gxh" Apr 17 20:15:35.747487 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:35.747471 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xtvgg" Apr 17 20:15:36.006217 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.006133 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:36.006375 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:36.006241 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:36.006375 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:36.006306 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs podName:d943896a-8c08-4d43-b1c4-d738b0079503 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:37.006289035 +0000 UTC m=+4.082661015 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs") pod "network-metrics-daemon-99wq2" (UID: "d943896a-8c08-4d43-b1c4-d738b0079503") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:36.134808 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:36.134738 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6efeec4a_10f8_42bb_a4e6_2efbc5f6a6bb.slice/crio-70ed866027633c7fd55e07eacef1a459dae636f48c013ee9612f67ada8be7ea3 WatchSource:0}: Error finding container 70ed866027633c7fd55e07eacef1a459dae636f48c013ee9612f67ada8be7ea3: Status 404 returned error can't find the container with id 70ed866027633c7fd55e07eacef1a459dae636f48c013ee9612f67ada8be7ea3 Apr 17 20:15:36.138911 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:36.138888 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87977dd5_cce0_46ae_8d11_ddfd87452aef.slice/crio-c3c17d00b5f684f053cfe88d730a4ad5cbb7d07fdb3dac9c002e7ee5690b7ada WatchSource:0}: Error finding container c3c17d00b5f684f053cfe88d730a4ad5cbb7d07fdb3dac9c002e7ee5690b7ada: Status 404 returned error can't find the container with id c3c17d00b5f684f053cfe88d730a4ad5cbb7d07fdb3dac9c002e7ee5690b7ada Apr 17 20:15:36.140113 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:36.140091 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc81797ca_3338_4294_8ef8_fb0416677637.slice/crio-c72daddae0bc05a4ee4f019a7c5b64c6186f5096752ed140b46a3bc0955c8bdb WatchSource:0}: Error finding container c72daddae0bc05a4ee4f019a7c5b64c6186f5096752ed140b46a3bc0955c8bdb: Status 404 returned error can't find the container with id c72daddae0bc05a4ee4f019a7c5b64c6186f5096752ed140b46a3bc0955c8bdb Apr 17 20:15:36.140927 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:36.140901 2580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d8cbec6_ac91_4373_b7ef_593404bf8a86.slice/crio-318222ed117768ba139a1a007c411191158e5a04dcef8526b1ffb52b41d7470f WatchSource:0}: Error finding container 318222ed117768ba139a1a007c411191158e5a04dcef8526b1ffb52b41d7470f: Status 404 returned error can't find the container with id 318222ed117768ba139a1a007c411191158e5a04dcef8526b1ffb52b41d7470f Apr 17 20:15:36.141925 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:36.141903 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16a5e25e_23a4_4106_a67e_adda44b1aaa6.slice/crio-f20f720520c07f1ce6376e9e920cda143ae391d1c6a528d3e0e0983b2a04a99c WatchSource:0}: Error finding container f20f720520c07f1ce6376e9e920cda143ae391d1c6a528d3e0e0983b2a04a99c: Status 404 returned error can't find the container with id f20f720520c07f1ce6376e9e920cda143ae391d1c6a528d3e0e0983b2a04a99c Apr 17 20:15:36.145224 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:15:36.145200 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod018ba037_0cf6_4ce0_ba07_95893c240cd2.slice/crio-fe0442ef4e019a010b5af06e72ff3e5e1dcd5d35ae88ee524b51583d48f3b107 WatchSource:0}: Error finding container fe0442ef4e019a010b5af06e72ff3e5e1dcd5d35ae88ee524b51583d48f3b107: Status 404 returned error can't find the container with id fe0442ef4e019a010b5af06e72ff3e5e1dcd5d35ae88ee524b51583d48f3b107 Apr 17 20:15:36.207060 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.206895 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m7jr\" (UniqueName: \"kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr\") pod \"network-check-target-d9lfz\" (UID: \"05059fd8-9f1b-4374-81cf-fd56830ab0bb\") " pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:36.207156 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:36.207028 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:36.207195 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:36.207160 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:36.207195 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:36.207173 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7m7jr for pod openshift-network-diagnostics/network-check-target-d9lfz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:36.207274 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:36.207220 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr podName:05059fd8-9f1b-4374-81cf-fd56830ab0bb nodeName:}" failed. No retries permitted until 2026-04-17 20:15:37.207206506 +0000 UTC m=+4.283578487 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7m7jr" (UniqueName: "kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr") pod "network-check-target-d9lfz" (UID: "05059fd8-9f1b-4374-81cf-fd56830ab0bb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:36.427796 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.427692 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:10:34 +0000 UTC" deadline="2028-01-11 07:58:14.044223466 +0000 UTC" Apr 17 20:15:36.427796 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.427727 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15203h42m37.61649957s" Apr 17 20:15:36.546631 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.546596 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal" event={"ID":"ad918dee123885abb804caebda37d740","Type":"ContainerStarted","Data":"1fbf75cd9131a114e48c9d189709e524424e1305ea572f1153640521f9969f32"} Apr 17 20:15:36.555182 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.555145 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" event={"ID":"630511e5-409a-44f6-9416-ee72411c751e","Type":"ContainerStarted","Data":"90cb11689f51d6bd0c93387687db6bb168d98f4511f59acf8691fd11fbf24ac3"} Apr 17 20:15:36.562923 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.562895 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" event={"ID":"7c43b13c-fe3e-4514-82f0-afc4e752be0a","Type":"ContainerStarted","Data":"86aa7be0954bc386b6e307367829d0d9ce9dca4ccf2ebd4c42f32390e0f4b2f7"} Apr 17 20:15:36.570017 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.569987 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lj6hr" event={"ID":"018ba037-0cf6-4ce0-ba07-95893c240cd2","Type":"ContainerStarted","Data":"fe0442ef4e019a010b5af06e72ff3e5e1dcd5d35ae88ee524b51583d48f3b107"} Apr 17 20:15:36.575082 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.575056 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h6fk4" event={"ID":"28b18d89-4df2-405e-8e06-5f5e39694305","Type":"ContainerStarted","Data":"246428f7c3b5e646edf6a6db83261cdb7e880cc04cab76a23651768e058c7e2b"} Apr 17 20:15:36.582858 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.582831 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n6lwg" event={"ID":"9d8cbec6-ac91-4373-b7ef-593404bf8a86","Type":"ContainerStarted","Data":"318222ed117768ba139a1a007c411191158e5a04dcef8526b1ffb52b41d7470f"} Apr 17 20:15:36.586014 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.585945 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xtvgg" event={"ID":"87977dd5-cce0-46ae-8d11-ddfd87452aef","Type":"ContainerStarted","Data":"c3c17d00b5f684f053cfe88d730a4ad5cbb7d07fdb3dac9c002e7ee5690b7ada"} Apr 17 20:15:36.588403 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.588278 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvs8p" 
event={"ID":"16a5e25e-23a4-4106-a67e-adda44b1aaa6","Type":"ContainerStarted","Data":"f20f720520c07f1ce6376e9e920cda143ae391d1c6a528d3e0e0983b2a04a99c"} Apr 17 20:15:36.596739 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.596592 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" event={"ID":"c81797ca-3338-4294-8ef8-fb0416677637","Type":"ContainerStarted","Data":"c72daddae0bc05a4ee4f019a7c5b64c6186f5096752ed140b46a3bc0955c8bdb"} Apr 17 20:15:36.599223 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:36.599074 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s6gxh" event={"ID":"6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb","Type":"ContainerStarted","Data":"70ed866027633c7fd55e07eacef1a459dae636f48c013ee9612f67ada8be7ea3"} Apr 17 20:15:37.013264 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:37.013222 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:37.013437 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:37.013396 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:37.013517 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:37.013478 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs podName:d943896a-8c08-4d43-b1c4-d738b0079503 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:39.013441159 +0000 UTC m=+6.089813145 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs") pod "network-metrics-daemon-99wq2" (UID: "d943896a-8c08-4d43-b1c4-d738b0079503") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:37.215763 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:37.215678 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m7jr\" (UniqueName: \"kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr\") pod \"network-check-target-d9lfz\" (UID: \"05059fd8-9f1b-4374-81cf-fd56830ab0bb\") " pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:37.228325 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:37.227660 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:37.228325 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:37.227688 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:37.228325 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:37.227709 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7m7jr for pod openshift-network-diagnostics/network-check-target-d9lfz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:37.228325 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:37.227773 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr podName:05059fd8-9f1b-4374-81cf-fd56830ab0bb nodeName:}" failed. No retries permitted until 2026-04-17 20:15:39.2277472 +0000 UTC m=+6.304119200 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7m7jr" (UniqueName: "kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr") pod "network-check-target-d9lfz" (UID: "05059fd8-9f1b-4374-81cf-fd56830ab0bb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:37.529003 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:37.528923 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:37.529424 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:37.529067 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:37.529517 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:37.529484 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:37.529594 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:37.529572 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:37.606223 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:37.606189 2580 generic.go:358] "Generic (PLEG): container finished" podID="e758ab3eb0bee98b6ae04d49ace534ee" containerID="ea445b44777073cdb3d751c43de47dff0fb83eb9ee230116d9ee2410c5270809" exitCode=0 Apr 17 20:15:37.607109 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:37.607085 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" event={"ID":"e758ab3eb0bee98b6ae04d49ace534ee","Type":"ContainerDied","Data":"ea445b44777073cdb3d751c43de47dff0fb83eb9ee230116d9ee2410c5270809"} Apr 17 20:15:37.620828 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:37.620777 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-57.ec2.internal" podStartSLOduration=3.620760627 podStartE2EDuration="3.620760627s" podCreationTimestamp="2026-04-17 20:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:15:36.559247551 +0000 UTC m=+3.635619556" watchObservedRunningTime="2026-04-17 20:15:37.620760627 +0000 UTC m=+4.697132630" Apr 17 20:15:38.615926 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:38.615888 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" event={"ID":"e758ab3eb0bee98b6ae04d49ace534ee","Type":"ContainerStarted","Data":"52796344dea24d2238b63066f3e03338464b6a3c8e4113dcf099bbaa34eeb879"} Apr 17 20:15:39.030012 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:39.029901 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:39.030177 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:39.030082 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:39.030177 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:39.030146 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs podName:d943896a-8c08-4d43-b1c4-d738b0079503 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:43.030127549 +0000 UTC m=+10.106499532 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs") pod "network-metrics-daemon-99wq2" (UID: "d943896a-8c08-4d43-b1c4-d738b0079503") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:39.232315 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:39.231662 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m7jr\" (UniqueName: \"kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr\") pod \"network-check-target-d9lfz\" (UID: \"05059fd8-9f1b-4374-81cf-fd56830ab0bb\") " pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:39.232315 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:39.231860 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:39.232315 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:39.231882 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:39.232315 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:39.231894 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7m7jr for pod openshift-network-diagnostics/network-check-target-d9lfz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:39.232315 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:39.231960 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr podName:05059fd8-9f1b-4374-81cf-fd56830ab0bb nodeName:}" failed. No retries permitted until 2026-04-17 20:15:43.231932221 +0000 UTC m=+10.308304218 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7m7jr" (UniqueName: "kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr") pod "network-check-target-d9lfz" (UID: "05059fd8-9f1b-4374-81cf-fd56830ab0bb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:39.528622 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:39.528539 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:39.528787 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:39.528657 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:39.528787 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:39.528709 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:39.528787 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:39.528765 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:41.527639 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:41.527606 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:41.528059 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:41.527655 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:41.528059 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:41.527733 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:41.528059 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:41.527942 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:43.061770 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:43.061725 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:43.062158 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:43.061871 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:43.062158 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:43.061932 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs podName:d943896a-8c08-4d43-b1c4-d738b0079503 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:51.06191616 +0000 UTC m=+18.138288140 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs") pod "network-metrics-daemon-99wq2" (UID: "d943896a-8c08-4d43-b1c4-d738b0079503") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:43.263700 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:43.263662 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m7jr\" (UniqueName: \"kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr\") pod \"network-check-target-d9lfz\" (UID: \"05059fd8-9f1b-4374-81cf-fd56830ab0bb\") " pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:43.263895 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:43.263844 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:43.263895 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:43.263867 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:43.263895 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:43.263881 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7m7jr for pod openshift-network-diagnostics/network-check-target-d9lfz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:43.264051 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:43.263940 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr podName:05059fd8-9f1b-4374-81cf-fd56830ab0bb nodeName:}" failed. No retries permitted until 2026-04-17 20:15:51.26391995 +0000 UTC m=+18.340291948 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7m7jr" (UniqueName: "kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr") pod "network-check-target-d9lfz" (UID: "05059fd8-9f1b-4374-81cf-fd56830ab0bb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:43.527380 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:43.527348 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:43.527565 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:43.527541 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:43.527640 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:43.527600 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:43.527748 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:43.527725 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:45.526140 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:45.526102 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:45.526537 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:45.526233 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:45.528125 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:45.528102 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:45.528257 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:45.528202 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:47.526476 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:47.526434 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:47.526840 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:47.526555 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:47.526840 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:47.526619 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:47.526840 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:47.526724 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:49.526265 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:49.526232 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:49.526753 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:49.526232 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:49.526753 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:49.526363 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:49.526753 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:49.526477 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:51.120772 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:51.120723 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:51.121308 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:51.120850 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:51.121308 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:51.120927 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs podName:d943896a-8c08-4d43-b1c4-d738b0079503 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:07.120905035 +0000 UTC m=+34.197277018 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs") pod "network-metrics-daemon-99wq2" (UID: "d943896a-8c08-4d43-b1c4-d738b0079503") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:51.322234 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:51.322193 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m7jr\" (UniqueName: \"kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr\") pod \"network-check-target-d9lfz\" (UID: \"05059fd8-9f1b-4374-81cf-fd56830ab0bb\") " pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:51.322534 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:51.322323 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:51.322534 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:51.322348 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:51.322534 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:51.322360 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7m7jr for pod openshift-network-diagnostics/network-check-target-d9lfz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:51.322534 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:51.322424 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr podName:05059fd8-9f1b-4374-81cf-fd56830ab0bb nodeName:}" failed. No retries permitted until 2026-04-17 20:16:07.32240637 +0000 UTC m=+34.398778356 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7m7jr" (UniqueName: "kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr") pod "network-check-target-d9lfz" (UID: "05059fd8-9f1b-4374-81cf-fd56830ab0bb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:51.526597 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:51.526519 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:51.526758 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:51.526522 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:51.526758 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:51.526647 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:51.526758 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:51.526732 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:53.527965 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:53.527939 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:53.528238 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:53.528063 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:53.528238 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:53.528115 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:53.528238 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:53.528219 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:54.645181 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.644778 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" event={"ID":"630511e5-409a-44f6-9416-ee72411c751e","Type":"ContainerStarted","Data":"8c2f7c3dff151557223265fd26003c71e9cdfbaca4158f9616552a5e62159da9"} Apr 17 20:15:54.646086 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.646068 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" event={"ID":"7c43b13c-fe3e-4514-82f0-afc4e752be0a","Type":"ContainerStarted","Data":"9e067acdf3f679397c239de45ac85f70fd7f6e48a942f999fdaeeb2065e4589e"} Apr 17 20:15:54.647203 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.647181 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lj6hr" event={"ID":"018ba037-0cf6-4ce0-ba07-95893c240cd2","Type":"ContainerStarted","Data":"e16454fad4b1f475bb5beacf1773b09bf37fbd2b6b73ad3a0b9c02c10b3a0faf"} Apr 17 20:15:54.648526 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.648506 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h6fk4" event={"ID":"28b18d89-4df2-405e-8e06-5f5e39694305","Type":"ContainerStarted","Data":"3ad951d685311eb5b1e009a4817466f09917e5d5813772b543723a116d8c48c1"} Apr 17 20:15:54.649721 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.649700 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n6lwg" event={"ID":"9d8cbec6-ac91-4373-b7ef-593404bf8a86","Type":"ContainerStarted","Data":"50405b21432f6c25f0a43dd5762dba40bf60190d7f6239b44de79a3646831035"} Apr 17 20:15:54.650870 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.650849 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xtvgg" event={"ID":"87977dd5-cce0-46ae-8d11-ddfd87452aef","Type":"ContainerStarted","Data":"c121cc779aab3eb5fd0cb22d7ad6875d6a9eff74aaa40a52e7c3f79f32f629fa"} Apr 17 20:15:54.652077 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.652055 2580 generic.go:358] "Generic (PLEG): container finished" podID="16a5e25e-23a4-4106-a67e-adda44b1aaa6" containerID="a85abf13f52c8c75d096bb5ef0d5b4fd4921c2dadfe6ed3eb0d1598ead777edb" exitCode=0 Apr 17 20:15:54.652141 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.652112 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvs8p" event={"ID":"16a5e25e-23a4-4106-a67e-adda44b1aaa6","Type":"ContainerDied","Data":"a85abf13f52c8c75d096bb5ef0d5b4fd4921c2dadfe6ed3eb0d1598ead777edb"} Apr 17 20:15:54.654370 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.654352 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-acl-logging/0.log" Apr 17 20:15:54.654671 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.654655 2580 generic.go:358] "Generic (PLEG): container finished" podID="c81797ca-3338-4294-8ef8-fb0416677637" containerID="0fa435e24a7a3995bba835508487f7572b09c0f324ecc5c4b50a39fde8211306" exitCode=1 Apr 17 20:15:54.654740 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.654681 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" 
event={"ID":"c81797ca-3338-4294-8ef8-fb0416677637","Type":"ContainerStarted","Data":"59bd63ab15fc55264f3e2cbf3d82d6ad89c024eb34558a005bfa28a5dd1ff28f"} Apr 17 20:15:54.654740 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.654695 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" event={"ID":"c81797ca-3338-4294-8ef8-fb0416677637","Type":"ContainerStarted","Data":"a41a33e249823c23af45a83175aa87a589a6b5e3bd3cc5893443333abb362fa2"} Apr 17 20:15:54.654740 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.654705 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" event={"ID":"c81797ca-3338-4294-8ef8-fb0416677637","Type":"ContainerStarted","Data":"37a0aaaa3a11385e09152adbfaf3504cf593fb52f1e3728155129dcbaef974ce"} Apr 17 20:15:54.654740 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.654713 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" event={"ID":"c81797ca-3338-4294-8ef8-fb0416677637","Type":"ContainerStarted","Data":"16214eed3f3778c70312b0e01d96ac3cf2442cf405dab3c641836c3ab3a0f4f2"} Apr 17 20:15:54.654740 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.654721 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" event={"ID":"c81797ca-3338-4294-8ef8-fb0416677637","Type":"ContainerDied","Data":"0fa435e24a7a3995bba835508487f7572b09c0f324ecc5c4b50a39fde8211306"} Apr 17 20:15:54.654740 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.654731 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" event={"ID":"c81797ca-3338-4294-8ef8-fb0416677637","Type":"ContainerStarted","Data":"c5a5ca03bb0917e391a3b5f3198caec17f72600166869f43092a3bb94ca37a96"} Apr 17 20:15:54.662505 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.662473 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-57.ec2.internal" podStartSLOduration=20.662446331 podStartE2EDuration="20.662446331s" podCreationTimestamp="2026-04-17 20:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:15:38.630539384 +0000 UTC m=+5.706911388" watchObservedRunningTime="2026-04-17 20:15:54.662446331 +0000 UTC m=+21.738818331" Apr 17 20:15:54.662891 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.662867 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zgtt5" podStartSLOduration=4.344221276 podStartE2EDuration="21.662858558s" podCreationTimestamp="2026-04-17 20:15:33 +0000 UTC" firstStartedPulling="2026-04-17 20:15:36.149108801 +0000 UTC m=+3.225480780" lastFinishedPulling="2026-04-17 20:15:53.467746069 +0000 UTC m=+20.544118062" observedRunningTime="2026-04-17 20:15:54.662506072 +0000 UTC m=+21.738878073" watchObservedRunningTime="2026-04-17 20:15:54.662858558 +0000 UTC m=+21.739230563" Apr 17 20:15:54.674963 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.674926 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xtvgg" podStartSLOduration=4.382664851 podStartE2EDuration="21.674916774s" podCreationTimestamp="2026-04-17 20:15:33 +0000 UTC" firstStartedPulling="2026-04-17 20:15:36.140733592 +0000 UTC m=+3.217105576" lastFinishedPulling="2026-04-17 
20:15:53.432985507 +0000 UTC m=+20.509357499" observedRunningTime="2026-04-17 20:15:54.674534265 +0000 UTC m=+21.750906268" watchObservedRunningTime="2026-04-17 20:15:54.674916774 +0000 UTC m=+21.751288775" Apr 17 20:15:54.686339 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.686310 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n6lwg" podStartSLOduration=12.454615817 podStartE2EDuration="21.686299795s" podCreationTimestamp="2026-04-17 20:15:33 +0000 UTC" firstStartedPulling="2026-04-17 20:15:36.143744962 +0000 UTC m=+3.220116944" lastFinishedPulling="2026-04-17 20:15:45.375428939 +0000 UTC m=+12.451800922" observedRunningTime="2026-04-17 20:15:54.686240916 +0000 UTC m=+21.762612918" watchObservedRunningTime="2026-04-17 20:15:54.686299795 +0000 UTC m=+21.762671798" Apr 17 20:15:54.736722 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.736647 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lj6hr" podStartSLOduration=4.412929836 podStartE2EDuration="21.736638102s" podCreationTimestamp="2026-04-17 20:15:33 +0000 UTC" firstStartedPulling="2026-04-17 20:15:36.147132064 +0000 UTC m=+3.223504057" lastFinishedPulling="2026-04-17 20:15:53.470840343 +0000 UTC m=+20.547212323" observedRunningTime="2026-04-17 20:15:54.725030745 +0000 UTC m=+21.801402747" watchObservedRunningTime="2026-04-17 20:15:54.736638102 +0000 UTC m=+21.813010103" Apr 17 20:15:54.736920 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.736901 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h6fk4" podStartSLOduration=4.449820165 podStartE2EDuration="21.736896224s" podCreationTimestamp="2026-04-17 20:15:33 +0000 UTC" firstStartedPulling="2026-04-17 20:15:36.145863384 +0000 UTC m=+3.222235365" lastFinishedPulling="2026-04-17 20:15:53.432939429 +0000 UTC m=+20.509311424" observedRunningTime="2026-04-17 20:15:54.736546165 +0000 UTC m=+21.812918180" watchObservedRunningTime="2026-04-17 20:15:54.736896224 +0000 UTC m=+21.813268225" Apr 17 20:15:54.956325 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:54.956294 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 20:15:55.460990 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:55.460857 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T20:15:54.956321647Z","UUID":"7c3bdfb9-4996-46c2-9e51-21cd7b1f1f5d","Handler":null,"Name":"","Endpoint":""} Apr 17 20:15:55.464303 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:55.464264 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 20:15:55.464303 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:55.464302 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 20:15:55.528627 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:55.528599 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:55.528766 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:55.528722 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:55.529104 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:55.529087 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:55.529199 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:55.529174 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:55.658823 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:55.658784 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" event={"ID":"7c43b13c-fe3e-4514-82f0-afc4e752be0a","Type":"ContainerStarted","Data":"d3cbba42d8247bd3ebaed4443bde1c93e0bad1c27fb267e26bc62f03333ee204"} Apr 17 20:15:55.660597 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:55.660571 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s6gxh" event={"ID":"6efeec4a-10f8-42bb-a4e6-2efbc5f6a6bb","Type":"ContainerStarted","Data":"2cf2035054cdfa60f796c143e6dbb6c43ce6b0ccd02046918b3d09a1ba1ba956"} Apr 17 20:15:55.677428 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:55.677378 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-s6gxh" podStartSLOduration=5.381673856 podStartE2EDuration="22.677365926s" podCreationTimestamp="2026-04-17 20:15:33 +0000 UTC" firstStartedPulling="2026-04-17 20:15:36.137285594 +0000 UTC m=+3.213657576" lastFinishedPulling="2026-04-17 20:15:53.432977656 +0000 UTC m=+20.509349646" observedRunningTime="2026-04-17 20:15:55.677267162 +0000 UTC m=+22.753639165" watchObservedRunningTime="2026-04-17 20:15:55.677365926 +0000 UTC m=+22.753737925" Apr 17 20:15:56.664123 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:56.663856 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" event={"ID":"7c43b13c-fe3e-4514-82f0-afc4e752be0a","Type":"ContainerStarted","Data":"707f2d5669cbb90da5b1bf95c33dbcdb853a23b5768d28952df9e84a3f14e650"} Apr 17 20:15:56.666997 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:56.666969 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-acl-logging/0.log" Apr 17 20:15:56.667334 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:56.667304 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" event={"ID":"c81797ca-3338-4294-8ef8-fb0416677637","Type":"ContainerStarted","Data":"29c5909d5be333fa5de1d3acd6641c512c5e5a3e5d864afb4ba317499528e3f6"} Apr 
17 20:15:56.678797 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:56.678743 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w8d2r" podStartSLOduration=3.8411271510000002 podStartE2EDuration="23.678725516s" podCreationTimestamp="2026-04-17 20:15:33 +0000 UTC" firstStartedPulling="2026-04-17 20:15:36.148319427 +0000 UTC m=+3.224691407" lastFinishedPulling="2026-04-17 20:15:55.985917777 +0000 UTC m=+23.062289772" observedRunningTime="2026-04-17 20:15:56.678281437 +0000 UTC m=+23.754653439" watchObservedRunningTime="2026-04-17 20:15:56.678725516 +0000 UTC m=+23.755097519" Apr 17 20:15:57.527029 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:57.526976 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:57.527253 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:57.526976 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:57.527253 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:57.527109 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:57.527253 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:57.527224 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:58.635849 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:58.635245 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xtvgg" Apr 17 20:15:58.636472 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:58.635897 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xtvgg" Apr 17 20:15:58.676938 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:58.675882 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-acl-logging/0.log" Apr 17 20:15:58.676938 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:58.676812 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" event={"ID":"c81797ca-3338-4294-8ef8-fb0416677637","Type":"ContainerStarted","Data":"bfba9f97dae61405f26ca5e2a69f9426160bcc46ab3b5d79873cff5e3f0d1037"} Apr 17 20:15:58.676938 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:58.676886 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:58.677349 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:58.676972 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:58.677349 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:58.677014 2580 scope.go:117] "RemoveContainer" containerID="0fa435e24a7a3995bba835508487f7572b09c0f324ecc5c4b50a39fde8211306" Apr 17 20:15:58.677349 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:58.677054 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:58.699852 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:58.699694 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:58.700948 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:58.700665 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:15:59.527060 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:59.526868 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:15:59.527247 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:59.526868 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:15:59.527247 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:59.527141 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:15:59.527247 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:15:59.527194 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:15:59.679596 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:59.679560 2580 generic.go:358] "Generic (PLEG): container finished" podID="16a5e25e-23a4-4106-a67e-adda44b1aaa6" containerID="100f7fd4b2126277a8a1f16dd146ed63e1a3a749f1c29c2d51c4af62ef02b19a" exitCode=0 Apr 17 20:15:59.680020 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:59.679628 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvs8p" event={"ID":"16a5e25e-23a4-4106-a67e-adda44b1aaa6","Type":"ContainerDied","Data":"100f7fd4b2126277a8a1f16dd146ed63e1a3a749f1c29c2d51c4af62ef02b19a"} Apr 17 20:15:59.682929 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:59.682904 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-acl-logging/0.log" Apr 17 20:15:59.683179 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:59.683160 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" event={"ID":"c81797ca-3338-4294-8ef8-fb0416677637","Type":"ContainerStarted","Data":"7fa7fdf8f6cf0943c34326e2eea422fe8eaea1782feef8b8374211c859c6241a"} Apr 17 20:15:59.720097 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:59.720053 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" podStartSLOduration=9.328501387 podStartE2EDuration="26.720041663s" podCreationTimestamp="2026-04-17 20:15:33 +0000 UTC" firstStartedPulling="2026-04-17 20:15:36.142075271 +0000 UTC m=+3.218447250" lastFinishedPulling="2026-04-17 20:15:53.533615539 +0000 UTC m=+20.609987526" observedRunningTime="2026-04-17 20:15:59.719592227 +0000 UTC m=+26.795964265" watchObservedRunningTime="2026-04-17 20:15:59.720041663 +0000 UTC m=+26.796413665" Apr 17 20:15:59.921637 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:59.921606 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xtvgg" Apr 17 20:15:59.921811 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:59.921751 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 20:15:59.922109 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:15:59.922092 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xtvgg" Apr 17 20:16:00.538290 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:00.538236 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-99wq2"] Apr 17 20:16:00.538413 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:00.538401 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:16:00.538572 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:00.538545 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:16:00.539043 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:00.539017 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d9lfz"] Apr 17 20:16:00.539143 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:00.539129 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:16:00.539228 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:00.539210 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:16:00.686443 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:00.686360 2580 generic.go:358] "Generic (PLEG): container finished" podID="16a5e25e-23a4-4106-a67e-adda44b1aaa6" containerID="bf75698647050c3cf6bb4e9dc17575220bfeabb0f4a7ca1f6e2d440648d336c9" exitCode=0 Apr 17 20:16:00.686803 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:00.686470 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvs8p" event={"ID":"16a5e25e-23a4-4106-a67e-adda44b1aaa6","Type":"ContainerDied","Data":"bf75698647050c3cf6bb4e9dc17575220bfeabb0f4a7ca1f6e2d440648d336c9"} Apr 17 20:16:01.689904 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:01.689713 2580 generic.go:358] "Generic (PLEG): container finished" podID="16a5e25e-23a4-4106-a67e-adda44b1aaa6" containerID="d90b4c4e43f04e9e6242e96220e278c08f96385164789d3e974bbe5ca4f04bfe" exitCode=0 Apr 17 20:16:01.689904 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:01.689768 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvs8p" event={"ID":"16a5e25e-23a4-4106-a67e-adda44b1aaa6","Type":"ContainerDied","Data":"d90b4c4e43f04e9e6242e96220e278c08f96385164789d3e974bbe5ca4f04bfe"} Apr 17 20:16:02.526332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:02.526298 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:16:02.526332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:02.526328 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:16:02.526605 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:02.526418 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:16:02.526605 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:02.526532 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:16:04.525985 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:04.525955 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:16:04.526624 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:04.526075 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:16:04.526624 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:04.526139 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:16:04.526624 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:04.526258 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:16:06.527040 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.526960 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:16:06.527634 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.526960 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:16:06.527634 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:06.527091 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d9lfz" podUID="05059fd8-9f1b-4374-81cf-fd56830ab0bb" Apr 17 20:16:06.527634 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:06.527205 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-99wq2" podUID="d943896a-8c08-4d43-b1c4-d738b0079503" Apr 17 20:16:06.686245 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.686214 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-57.ec2.internal" event="NodeReady" Apr 17 20:16:06.686428 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.686347 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 20:16:06.727561 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.727526 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nztqs"] Apr 17 20:16:06.738441 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.738410 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4x62t"] Apr 17 20:16:06.739381 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.738753 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:06.741233 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.741211 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 20:16:06.741233 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.741234 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 20:16:06.741376 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.741362 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-x9tzj\"" Apr 17 20:16:06.745217 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.745194 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4x62t"] Apr 17 20:16:06.745328 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.745224 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nztqs"] Apr 17 20:16:06.745328 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.745320 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:16:06.747514 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.747495 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ndtv9\"" Apr 17 20:16:06.747959 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.747730 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 20:16:06.748027 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.747970 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 20:16:06.748212 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.748197 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 20:16:06.838543 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.838505 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vll2x\" (UniqueName: \"kubernetes.io/projected/29f57080-c48b-42b7-8c1a-747b7fd06533-kube-api-access-vll2x\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:06.838709 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.838571 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:06.838709 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.838597 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:16:06.838709 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.838665 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/29f57080-c48b-42b7-8c1a-747b7fd06533-tmp-dir\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:06.838709 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.838698 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29f57080-c48b-42b7-8c1a-747b7fd06533-config-volume\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:06.838894 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.838716 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7dmf\" (UniqueName: \"kubernetes.io/projected/e270f686-4250-41f8-a9c1-6f192df2ee57-kube-api-access-x7dmf\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:16:06.939985 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.939943 2580 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/29f57080-c48b-42b7-8c1a-747b7fd06533-tmp-dir\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:06.940161 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.939992 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29f57080-c48b-42b7-8c1a-747b7fd06533-config-volume\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:06.940161 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.940022 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7dmf\" (UniqueName: \"kubernetes.io/projected/e270f686-4250-41f8-a9c1-6f192df2ee57-kube-api-access-x7dmf\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:16:06.940161 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.940049 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vll2x\" (UniqueName: \"kubernetes.io/projected/29f57080-c48b-42b7-8c1a-747b7fd06533-kube-api-access-vll2x\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:06.940161 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.940077 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:06.940161 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.940100 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:16:06.940407 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:06.940231 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:06.940407 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:06.940299 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert podName:e270f686-4250-41f8-a9c1-6f192df2ee57 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:07.440275342 +0000 UTC m=+34.516647338 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert") pod "ingress-canary-4x62t" (UID: "e270f686-4250-41f8-a9c1-6f192df2ee57") : secret "canary-serving-cert" not found Apr 17 20:16:06.940900 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.940879 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/29f57080-c48b-42b7-8c1a-747b7fd06533-tmp-dir\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:06.941307 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.941282 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29f57080-c48b-42b7-8c1a-747b7fd06533-config-volume\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:06.941779 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:06.941756 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:06.941873 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:06.941815 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls podName:29f57080-c48b-42b7-8c1a-747b7fd06533 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:07.441797833 +0000 UTC m=+34.518169812 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls") pod "dns-default-nztqs" (UID: "29f57080-c48b-42b7-8c1a-747b7fd06533") : secret "dns-default-metrics-tls" not found Apr 17 20:16:06.950947 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.950928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vll2x\" (UniqueName: \"kubernetes.io/projected/29f57080-c48b-42b7-8c1a-747b7fd06533-kube-api-access-vll2x\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:06.951430 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:06.951410 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7dmf\" (UniqueName: \"kubernetes.io/projected/e270f686-4250-41f8-a9c1-6f192df2ee57-kube-api-access-x7dmf\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:16:07.142120 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:07.142042 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:16:07.142264 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:07.142187 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:16:07.142264 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:07.142259 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs podName:d943896a-8c08-4d43-b1c4-d738b0079503 
nodeName:}" failed. No retries permitted until 2026-04-17 20:16:39.142241864 +0000 UTC m=+66.218613848 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs") pod "network-metrics-daemon-99wq2" (UID: "d943896a-8c08-4d43-b1c4-d738b0079503") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:16:07.343905 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:07.343762 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m7jr\" (UniqueName: \"kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr\") pod \"network-check-target-d9lfz\" (UID: \"05059fd8-9f1b-4374-81cf-fd56830ab0bb\") " pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:16:07.344031 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:07.343923 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:16:07.344031 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:07.343947 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:16:07.344031 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:07.343959 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7m7jr for pod openshift-network-diagnostics/network-check-target-d9lfz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:16:07.344031 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:07.344022 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr podName:05059fd8-9f1b-4374-81cf-fd56830ab0bb nodeName:}" failed. No retries permitted until 2026-04-17 20:16:39.344004556 +0000 UTC m=+66.420376539 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7m7jr" (UniqueName: "kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr") pod "network-check-target-d9lfz" (UID: "05059fd8-9f1b-4374-81cf-fd56830ab0bb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:16:07.444569 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:07.444497 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:07.444569 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:07.444530 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:16:07.444774 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:07.444625 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:07.444774 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:07.444658 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:07.444774 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:07.444684 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert podName:e270f686-4250-41f8-a9c1-6f192df2ee57 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:08.444668202 +0000 UTC m=+35.521040184 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert") pod "ingress-canary-4x62t" (UID: "e270f686-4250-41f8-a9c1-6f192df2ee57") : secret "canary-serving-cert" not found Apr 17 20:16:07.444774 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:07.444725 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls podName:29f57080-c48b-42b7-8c1a-747b7fd06533 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:08.444705012 +0000 UTC m=+35.521076996 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls") pod "dns-default-nztqs" (UID: "29f57080-c48b-42b7-8c1a-747b7fd06533") : secret "dns-default-metrics-tls" not found Apr 17 20:16:07.703030 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:07.702957 2580 generic.go:358] "Generic (PLEG): container finished" podID="16a5e25e-23a4-4106-a67e-adda44b1aaa6" containerID="032350f2eb4124d75cecf1dbf985b62bcd59af7dae1a962ed0ee5215b0923c8b" exitCode=0 Apr 17 20:16:07.703030 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:07.703009 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvs8p" event={"ID":"16a5e25e-23a4-4106-a67e-adda44b1aaa6","Type":"ContainerDied","Data":"032350f2eb4124d75cecf1dbf985b62bcd59af7dae1a962ed0ee5215b0923c8b"} Apr 17 20:16:08.451511 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:08.451446 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:08.451511 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:08.451513 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:16:08.451717 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:08.451591 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:08.451717 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:08.451597 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:08.451717 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:08.451655 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls podName:29f57080-c48b-42b7-8c1a-747b7fd06533 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:10.451638066 +0000 UTC m=+37.528010047 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls") pod "dns-default-nztqs" (UID: "29f57080-c48b-42b7-8c1a-747b7fd06533") : secret "dns-default-metrics-tls" not found Apr 17 20:16:08.451717 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:08.451669 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert podName:e270f686-4250-41f8-a9c1-6f192df2ee57 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:10.451663296 +0000 UTC m=+37.528035276 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert") pod "ingress-canary-4x62t" (UID: "e270f686-4250-41f8-a9c1-6f192df2ee57") : secret "canary-serving-cert" not found Apr 17 20:16:08.526410 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:08.526380 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:16:08.526561 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:08.526413 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:16:08.529144 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:08.529123 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 20:16:08.529144 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:08.529134 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x8drw\"" Apr 17 20:16:08.529326 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:08.529126 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:16:08.529326 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:08.529153 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:16:08.529326 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:08.529187 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rfhz5\"" Apr 17 20:16:08.707372 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:08.707287 2580 generic.go:358] "Generic (PLEG): container finished" podID="16a5e25e-23a4-4106-a67e-adda44b1aaa6" containerID="0a56262dbf020c449adea63d460531294a9617874ce4adcb68a5c9eb240acfc8" exitCode=0 Apr 17 20:16:08.707372 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:08.707348 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvs8p" event={"ID":"16a5e25e-23a4-4106-a67e-adda44b1aaa6","Type":"ContainerDied","Data":"0a56262dbf020c449adea63d460531294a9617874ce4adcb68a5c9eb240acfc8"} Apr 17 20:16:09.712532 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:09.712494 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvs8p" event={"ID":"16a5e25e-23a4-4106-a67e-adda44b1aaa6","Type":"ContainerStarted","Data":"2a3b076024ccec2d5e5d52f3f8a16996c5eb7ceaa4b042f4829079eb471ea664"} Apr 17 20:16:09.732175 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:09.732134 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qvs8p" podStartSLOduration=5.683579235 podStartE2EDuration="36.732119312s" podCreationTimestamp="2026-04-17 20:15:33 +0000 UTC" firstStartedPulling="2026-04-17 20:15:36.14553382 +0000 UTC m=+3.221905801" lastFinishedPulling="2026-04-17 20:16:07.194073898 +0000 UTC m=+34.270445878" observedRunningTime="2026-04-17 20:16:09.731703767 +0000 UTC m=+36.808075770" watchObservedRunningTime="2026-04-17 20:16:09.732119312 +0000 UTC m=+36.808491314" Apr 17 20:16:10.464241 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:10.464160 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:10.464241 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:10.464197 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:16:10.464482 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:10.464302 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:10.464482 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:10.464330 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:10.464482 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:10.464356 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert podName:e270f686-4250-41f8-a9c1-6f192df2ee57 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:14.464343236 +0000 UTC m=+41.540715216 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert") pod "ingress-canary-4x62t" (UID: "e270f686-4250-41f8-a9c1-6f192df2ee57") : secret "canary-serving-cert" not found Apr 17 20:16:10.464482 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:10.464394 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls podName:29f57080-c48b-42b7-8c1a-747b7fd06533 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:14.464375856 +0000 UTC m=+41.540747839 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls") pod "dns-default-nztqs" (UID: "29f57080-c48b-42b7-8c1a-747b7fd06533") : secret "dns-default-metrics-tls" not found Apr 17 20:16:14.492021 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:14.491985 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:14.492021 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:14.492028 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:16:14.492475 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:14.492139 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:14.492475 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:14.492152 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:14.492475 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:14.492200 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert podName:e270f686-4250-41f8-a9c1-6f192df2ee57 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:22.492182782 +0000 UTC m=+49.568554762 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert") pod "ingress-canary-4x62t" (UID: "e270f686-4250-41f8-a9c1-6f192df2ee57") : secret "canary-serving-cert" not found Apr 17 20:16:14.492475 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:14.492213 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls podName:29f57080-c48b-42b7-8c1a-747b7fd06533 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:22.492207185 +0000 UTC m=+49.568579164 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls") pod "dns-default-nztqs" (UID: "29f57080-c48b-42b7-8c1a-747b7fd06533") : secret "dns-default-metrics-tls" not found Apr 17 20:16:22.543693 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:22.543659 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:22.543693 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:22.543693 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:16:22.544205 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:22.543785 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:22.544205 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:22.543794 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:22.544205 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:22.543836 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert podName:e270f686-4250-41f8-a9c1-6f192df2ee57 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:38.543823647 +0000 UTC m=+65.620195631 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert") pod "ingress-canary-4x62t" (UID: "e270f686-4250-41f8-a9c1-6f192df2ee57") : secret "canary-serving-cert" not found Apr 17 20:16:22.544205 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:22.543856 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls podName:29f57080-c48b-42b7-8c1a-747b7fd06533 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:38.543841711 +0000 UTC m=+65.620213696 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls") pod "dns-default-nztqs" (UID: "29f57080-c48b-42b7-8c1a-747b7fd06533") : secret "dns-default-metrics-tls" not found Apr 17 20:16:30.697391 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:30.697357 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vrmd6" Apr 17 20:16:38.552165 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:38.552127 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:16:38.552165 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:38.552163 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:16:38.552604 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:38.552258 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:38.552604 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:38.552265 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:38.552604 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:38.552337 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert podName:e270f686-4250-41f8-a9c1-6f192df2ee57 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:10.552312147 +0000 UTC m=+97.628684130 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert") pod "ingress-canary-4x62t" (UID: "e270f686-4250-41f8-a9c1-6f192df2ee57") : secret "canary-serving-cert" not found Apr 17 20:16:38.552604 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:38.552350 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls podName:29f57080-c48b-42b7-8c1a-747b7fd06533 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:10.552344369 +0000 UTC m=+97.628716348 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls") pod "dns-default-nztqs" (UID: "29f57080-c48b-42b7-8c1a-747b7fd06533") : secret "dns-default-metrics-tls" not found Apr 17 20:16:39.156338 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:39.156297 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:16:39.158797 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:39.158779 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 20:16:39.167389 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:39.167370 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:16:39.167480 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:16:39.167449 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs podName:d943896a-8c08-4d43-b1c4-d738b0079503 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:43.167426805 +0000 UTC m=+130.243798786 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs") pod "network-metrics-daemon-99wq2" (UID: "d943896a-8c08-4d43-b1c4-d738b0079503") : secret "metrics-daemon-secret" not found Apr 17 20:16:39.357929 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:39.357886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m7jr\" (UniqueName: \"kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr\") pod \"network-check-target-d9lfz\" (UID: \"05059fd8-9f1b-4374-81cf-fd56830ab0bb\") " pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:16:39.360223 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:39.360204 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:16:39.370815 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:39.370798 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:16:39.382780 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:39.382752 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m7jr\" (UniqueName: \"kubernetes.io/projected/05059fd8-9f1b-4374-81cf-fd56830ab0bb-kube-api-access-7m7jr\") pod \"network-check-target-d9lfz\" (UID: \"05059fd8-9f1b-4374-81cf-fd56830ab0bb\") " pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:16:39.448411 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:39.448358 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x8drw\"" Apr 17 20:16:39.456103 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:39.456080 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:16:39.576795 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:39.576769 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d9lfz"] Apr 17 20:16:39.580224 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:16:39.580199 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05059fd8_9f1b_4374_81cf_fd56830ab0bb.slice/crio-ec51252736a4e94433e9981a77f2f638c5ec3995ecc3729eaf69306701514222 WatchSource:0}: Error finding container ec51252736a4e94433e9981a77f2f638c5ec3995ecc3729eaf69306701514222: Status 404 returned error can't find the container with id ec51252736a4e94433e9981a77f2f638c5ec3995ecc3729eaf69306701514222 Apr 17 20:16:39.767726 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:39.767648 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d9lfz" event={"ID":"05059fd8-9f1b-4374-81cf-fd56830ab0bb","Type":"ContainerStarted","Data":"ec51252736a4e94433e9981a77f2f638c5ec3995ecc3729eaf69306701514222"} Apr 17 20:16:42.774924 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:42.774894 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d9lfz" event={"ID":"05059fd8-9f1b-4374-81cf-fd56830ab0bb","Type":"ContainerStarted","Data":"27f03f5ccd06a880aa53d27cd48d2b53b544962bfca6851ad39956491a9c4579"} Apr 17 20:16:42.775270 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:42.775002 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:16:42.789033 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:16:42.788988 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-d9lfz" podStartSLOduration=67.160188867 podStartE2EDuration="1m9.788975075s" podCreationTimestamp="2026-04-17 20:15:33 +0000 UTC" firstStartedPulling="2026-04-17 20:16:39.582093473 +0000 UTC m=+66.658465453" lastFinishedPulling="2026-04-17 20:16:42.210879681 +0000 UTC m=+69.287251661" observedRunningTime="2026-04-17 20:16:42.78813169 +0000 UTC m=+69.864503691" watchObservedRunningTime="2026-04-17 20:16:42.788975075 +0000 UTC m=+69.865347078" Apr 17 20:17:02.159957 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.159848 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn"] Apr 17 20:17:02.163814 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.163793 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-65d65c8fdb-58dhc"] Apr 17 20:17:02.163952 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.163935 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:02.166064 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.166042 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bjps6\"" Apr 17 20:17:02.166317 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.166098 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:17:02.166317 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.166242 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 20:17:02.166317 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.166283 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 20:17:02.166699 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.166684 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.168820 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.168800 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 20:17:02.168941 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.168829 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 20:17:02.168941 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.168848 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 20:17:02.168941 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.168863 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 20:17:02.169100 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.168839 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 20:17:02.169176 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.169160 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 20:17:02.169231 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.169178 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-5jdcv\"" Apr 17 20:17:02.172809 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.172792 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn"] Apr 17 20:17:02.179335 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.179303 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-65d65c8fdb-58dhc"] Apr 17 20:17:02.209314 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.209288 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csndt\" (UniqueName: \"kubernetes.io/projected/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-kube-api-access-csndt\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " 
pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.209428 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.209326 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ht5sn\" (UID: \"658a2744-7815-42ec-bb05-a0eeeb012bc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:02.209428 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.209351 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.209428 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.209371 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-default-certificate\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.209428 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.209390 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-stats-auth\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.209428 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.209412 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvsv2\" (UniqueName: \"kubernetes.io/projected/658a2744-7815-42ec-bb05-a0eeeb012bc4-kube-api-access-vvsv2\") pod \"cluster-samples-operator-6dc5bdb6b4-ht5sn\" (UID: \"658a2744-7815-42ec-bb05-a0eeeb012bc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:02.209614 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.209438 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.310678 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.310648 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csndt\" (UniqueName: \"kubernetes.io/projected/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-kube-api-access-csndt\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.310839 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.310688 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-ht5sn\" (UID: \"658a2744-7815-42ec-bb05-a0eeeb012bc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:02.310839 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.310713 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.310839 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:02.310790 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:17:02.310839 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:02.310797 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:17:02.311052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.310837 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-default-certificate\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.311052 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:02.310845 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs podName:3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:02.810830979 +0000 UTC m=+89.887202959 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs") pod "router-default-65d65c8fdb-58dhc" (UID: "3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9") : secret "router-metrics-certs-default" not found Apr 17 20:17:02.311052 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:02.310894 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls podName:658a2744-7815-42ec-bb05-a0eeeb012bc4 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:02.810875253 +0000 UTC m=+89.887247233 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ht5sn" (UID: "658a2744-7815-42ec-bb05-a0eeeb012bc4") : secret "samples-operator-tls" not found Apr 17 20:17:02.311052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.310914 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-stats-auth\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.311052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.310961 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvsv2\" (UniqueName: \"kubernetes.io/projected/658a2744-7815-42ec-bb05-a0eeeb012bc4-kube-api-access-vvsv2\") pod \"cluster-samples-operator-6dc5bdb6b4-ht5sn\" (UID: \"658a2744-7815-42ec-bb05-a0eeeb012bc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:02.311052 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.311002 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.311282 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:02.311123 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle podName:3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:02.81111127 +0000 UTC m=+89.887483254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle") pod "router-default-65d65c8fdb-58dhc" (UID: "3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9") : configmap references non-existent config key: service-ca.crt Apr 17 20:17:02.313261 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.313243 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-stats-auth\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.313397 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.313382 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-default-certificate\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.318444 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.318420 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csndt\" (UniqueName: \"kubernetes.io/projected/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-kube-api-access-csndt\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.318916 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.318899 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvsv2\" (UniqueName: \"kubernetes.io/projected/658a2744-7815-42ec-bb05-a0eeeb012bc4-kube-api-access-vvsv2\") pod \"cluster-samples-operator-6dc5bdb6b4-ht5sn\" (UID: \"658a2744-7815-42ec-bb05-a0eeeb012bc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:02.815173 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.815144 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ht5sn\" (UID: \"658a2744-7815-42ec-bb05-a0eeeb012bc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:02.815414 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.815183 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.815414 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:02.815216 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:02.815414 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:02.815297 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 
20:17:02.815414 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:02.815324 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle podName:3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:03.815309999 +0000 UTC m=+90.891681979 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle") pod "router-default-65d65c8fdb-58dhc" (UID: "3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9") : configmap references non-existent config key: service-ca.crt Apr 17 20:17:02.815414 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:02.815361 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls podName:658a2744-7815-42ec-bb05-a0eeeb012bc4 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:03.815342687 +0000 UTC m=+90.891714670 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ht5sn" (UID: "658a2744-7815-42ec-bb05-a0eeeb012bc4") : secret "samples-operator-tls" not found Apr 17 20:17:02.815414 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:02.815297 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:17:02.815414 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:02.815405 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs podName:3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:03.815396571 +0000 UTC m=+90.891768550 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs") pod "router-default-65d65c8fdb-58dhc" (UID: "3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9") : secret "router-metrics-certs-default" not found Apr 17 20:17:03.821623 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:03.821590 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:03.821986 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:03.821654 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ht5sn\" (UID: \"658a2744-7815-42ec-bb05-a0eeeb012bc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:03.821986 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:03.821686 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:03.821986 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:03.821735 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle podName:3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:05.821714923 +0000 UTC m=+92.898086906 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle") pod "router-default-65d65c8fdb-58dhc" (UID: "3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9") : configmap references non-existent config key: service-ca.crt Apr 17 20:17:03.821986 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:03.821771 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:17:03.821986 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:03.821784 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:17:03.821986 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:03.821815 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs podName:3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:05.821804307 +0000 UTC m=+92.898176287 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs") pod "router-default-65d65c8fdb-58dhc" (UID: "3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9") : secret "router-metrics-certs-default" not found Apr 17 20:17:03.821986 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:03.821838 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls podName:658a2744-7815-42ec-bb05-a0eeeb012bc4 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:05.821824416 +0000 UTC m=+92.898196407 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ht5sn" (UID: "658a2744-7815-42ec-bb05-a0eeeb012bc4") : secret "samples-operator-tls" not found Apr 17 20:17:05.838231 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:05.838201 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ht5sn\" (UID: \"658a2744-7815-42ec-bb05-a0eeeb012bc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:05.838670 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:05.838244 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:05.838670 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:05.838292 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:05.838670 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:05.838352 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:17:05.838670 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:05.838414 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle podName:3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:09.838398097 +0000 UTC m=+96.914770076 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle") pod "router-default-65d65c8fdb-58dhc" (UID: "3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9") : configmap references non-existent config key: service-ca.crt Apr 17 20:17:05.838670 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:05.838432 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls podName:658a2744-7815-42ec-bb05-a0eeeb012bc4 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:17:09.838424304 +0000 UTC m=+96.914796287 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ht5sn" (UID: "658a2744-7815-42ec-bb05-a0eeeb012bc4") : secret "samples-operator-tls" not found Apr 17 20:17:05.838670 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:05.838440 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:17:05.838670 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:05.838527 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs podName:3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:09.838508372 +0000 UTC m=+96.914880353 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs") pod "router-default-65d65c8fdb-58dhc" (UID: "3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9") : secret "router-metrics-certs-default" not found Apr 17 20:17:07.963382 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:07.963347 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bmcs7"] Apr 17 20:17:07.966369 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:07.966354 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bmcs7" Apr 17 20:17:07.968529 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:07.968508 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 20:17:07.968723 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:07.968707 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-f4wvv\"" Apr 17 20:17:07.969429 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:07.969416 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 20:17:07.974160 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:07.974141 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bmcs7"] Apr 17 20:17:08.051707 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:08.051676 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br6kb\" (UniqueName: \"kubernetes.io/projected/719e086f-1a8a-434b-90a0-cd72fcae76c0-kube-api-access-br6kb\") pod \"migrator-74bb7799d9-bmcs7\" (UID: \"719e086f-1a8a-434b-90a0-cd72fcae76c0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bmcs7" Apr 17 20:17:08.153011 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:08.152971 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-br6kb\" (UniqueName: \"kubernetes.io/projected/719e086f-1a8a-434b-90a0-cd72fcae76c0-kube-api-access-br6kb\") pod \"migrator-74bb7799d9-bmcs7\" (UID: \"719e086f-1a8a-434b-90a0-cd72fcae76c0\") " 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bmcs7" Apr 17 20:17:08.160379 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:08.160342 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-br6kb\" (UniqueName: \"kubernetes.io/projected/719e086f-1a8a-434b-90a0-cd72fcae76c0-kube-api-access-br6kb\") pod \"migrator-74bb7799d9-bmcs7\" (UID: \"719e086f-1a8a-434b-90a0-cd72fcae76c0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bmcs7" Apr 17 20:17:08.275679 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:08.275617 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bmcs7" Apr 17 20:17:08.401365 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:08.401338 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bmcs7"] Apr 17 20:17:08.404796 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:08.404765 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod719e086f_1a8a_434b_90a0_cd72fcae76c0.slice/crio-cf2967b955980b33f7b643b98f9255ac75443ab6a212bc8bf8a583117d159a27 WatchSource:0}: Error finding container cf2967b955980b33f7b643b98f9255ac75443ab6a212bc8bf8a583117d159a27: Status 404 returned error can't find the container with id cf2967b955980b33f7b643b98f9255ac75443ab6a212bc8bf8a583117d159a27 Apr 17 20:17:08.822902 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:08.822862 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bmcs7" event={"ID":"719e086f-1a8a-434b-90a0-cd72fcae76c0","Type":"ContainerStarted","Data":"cf2967b955980b33f7b643b98f9255ac75443ab6a212bc8bf8a583117d159a27"} Apr 17 20:17:09.825831 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:09.825798 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bmcs7" event={"ID":"719e086f-1a8a-434b-90a0-cd72fcae76c0","Type":"ContainerStarted","Data":"aeefc85736de9aa725b54dc72d94f0d84784550d5a396ba81047102120d431c2"} Apr 17 20:17:09.864958 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:09.864923 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ht5sn\" (UID: \"658a2744-7815-42ec-bb05-a0eeeb012bc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:09.865059 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:09.864960 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:09.865059 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:09.865011 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 
20:17:09.865167 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:09.865058 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:17:09.865167 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:09.865105 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:17:09.865167 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:09.865109 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls podName:658a2744-7815-42ec-bb05-a0eeeb012bc4 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:17.865091127 +0000 UTC m=+104.941463108 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ht5sn" (UID: "658a2744-7815-42ec-bb05-a0eeeb012bc4") : secret "samples-operator-tls" not found Apr 17 20:17:09.865167 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:09.865161 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle podName:3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:17.865148989 +0000 UTC m=+104.941520970 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle") pod "router-default-65d65c8fdb-58dhc" (UID: "3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9") : configmap references non-existent config key: service-ca.crt Apr 17 20:17:09.865375 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:09.865172 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs podName:3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:17.865166572 +0000 UTC m=+104.941538551 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs") pod "router-default-65d65c8fdb-58dhc" (UID: "3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9") : secret "router-metrics-certs-default" not found Apr 17 20:17:10.180281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.180247 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zckcj"] Apr 17 20:17:10.183164 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.183146 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zckcj" Apr 17 20:17:10.185304 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.185276 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 20:17:10.185505 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.185488 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 20:17:10.185568 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.185548 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 20:17:10.185612 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.185562 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 20:17:10.186193 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.186179 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bwhcc\"" Apr 17 20:17:10.190171 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.190147 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zckcj"] Apr 17 20:17:10.267393 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.267359 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/58c3570c-5668-47bd-bf38-25544374725a-signing-key\") pod \"service-ca-865cb79987-zckcj\" (UID: \"58c3570c-5668-47bd-bf38-25544374725a\") " pod="openshift-service-ca/service-ca-865cb79987-zckcj" Apr 17 20:17:10.267393 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.267396 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqxfp\" (UniqueName: \"kubernetes.io/projected/58c3570c-5668-47bd-bf38-25544374725a-kube-api-access-wqxfp\") pod \"service-ca-865cb79987-zckcj\" (UID: \"58c3570c-5668-47bd-bf38-25544374725a\") " pod="openshift-service-ca/service-ca-865cb79987-zckcj" Apr 17 20:17:10.267629 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.267417 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/58c3570c-5668-47bd-bf38-25544374725a-signing-cabundle\") pod \"service-ca-865cb79987-zckcj\" (UID: \"58c3570c-5668-47bd-bf38-25544374725a\") " pod="openshift-service-ca/service-ca-865cb79987-zckcj" Apr 17 20:17:10.368044 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.367976 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/58c3570c-5668-47bd-bf38-25544374725a-signing-key\") pod \"service-ca-865cb79987-zckcj\" (UID: \"58c3570c-5668-47bd-bf38-25544374725a\") " pod="openshift-service-ca/service-ca-865cb79987-zckcj" Apr 17 20:17:10.368044 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.368012 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxfp\" (UniqueName: \"kubernetes.io/projected/58c3570c-5668-47bd-bf38-25544374725a-kube-api-access-wqxfp\") pod \"service-ca-865cb79987-zckcj\" (UID: \"58c3570c-5668-47bd-bf38-25544374725a\") " pod="openshift-service-ca/service-ca-865cb79987-zckcj" Apr 17 20:17:10.368044 ip-10-0-132-57 kubenswrapper[2580]: I0417 
20:17:10.368035 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/58c3570c-5668-47bd-bf38-25544374725a-signing-cabundle\") pod \"service-ca-865cb79987-zckcj\" (UID: \"58c3570c-5668-47bd-bf38-25544374725a\") " pod="openshift-service-ca/service-ca-865cb79987-zckcj" Apr 17 20:17:10.368749 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.368729 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/58c3570c-5668-47bd-bf38-25544374725a-signing-cabundle\") pod \"service-ca-865cb79987-zckcj\" (UID: \"58c3570c-5668-47bd-bf38-25544374725a\") " pod="openshift-service-ca/service-ca-865cb79987-zckcj" Apr 17 20:17:10.370602 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.370585 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/58c3570c-5668-47bd-bf38-25544374725a-signing-key\") pod \"service-ca-865cb79987-zckcj\" (UID: \"58c3570c-5668-47bd-bf38-25544374725a\") " pod="openshift-service-ca/service-ca-865cb79987-zckcj" Apr 17 20:17:10.375825 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.375806 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqxfp\" (UniqueName: \"kubernetes.io/projected/58c3570c-5668-47bd-bf38-25544374725a-kube-api-access-wqxfp\") pod \"service-ca-865cb79987-zckcj\" (UID: \"58c3570c-5668-47bd-bf38-25544374725a\") " pod="openshift-service-ca/service-ca-865cb79987-zckcj" Apr 17 20:17:10.492600 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.492574 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zckcj" Apr 17 20:17:10.570023 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.569996 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:17:10.570141 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.570049 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:17:10.570182 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:10.570151 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:17:10.570215 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:10.570181 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:17:10.570246 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:10.570223 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls podName:29f57080-c48b-42b7-8c1a-747b7fd06533 nodeName:}" failed. No retries permitted until 2026-04-17 20:18:14.570204432 +0000 UTC m=+161.646576411 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls") pod "dns-default-nztqs" (UID: "29f57080-c48b-42b7-8c1a-747b7fd06533") : secret "dns-default-metrics-tls" not found Apr 17 20:17:10.570246 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:10.570242 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert podName:e270f686-4250-41f8-a9c1-6f192df2ee57 nodeName:}" failed. No retries permitted until 2026-04-17 20:18:14.570232636 +0000 UTC m=+161.646604617 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert") pod "ingress-canary-4x62t" (UID: "e270f686-4250-41f8-a9c1-6f192df2ee57") : secret "canary-serving-cert" not found Apr 17 20:17:10.599984 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.599959 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zckcj"] Apr 17 20:17:10.604020 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:10.603983 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c3570c_5668_47bd_bf38_25544374725a.slice/crio-02029fe8895fd9c2bb8450f75ec4137993f2191f212543d0c4654336cd903576 WatchSource:0}: Error finding container 02029fe8895fd9c2bb8450f75ec4137993f2191f212543d0c4654336cd903576: Status 404 returned error can't find the container with id 02029fe8895fd9c2bb8450f75ec4137993f2191f212543d0c4654336cd903576 Apr 17 20:17:10.829474 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.829435 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bmcs7" event={"ID":"719e086f-1a8a-434b-90a0-cd72fcae76c0","Type":"ContainerStarted","Data":"7ad3905de5e156d48f13d87b506e1388917b092e8be837f6a7276f9976fdf032"} Apr 17 20:17:10.830410 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.830388 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zckcj" event={"ID":"58c3570c-5668-47bd-bf38-25544374725a","Type":"ContainerStarted","Data":"02029fe8895fd9c2bb8450f75ec4137993f2191f212543d0c4654336cd903576"} Apr 17 20:17:10.835333 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.835316 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h6fk4_28b18d89-4df2-405e-8e06-5f5e39694305/dns-node-resolver/0.log" Apr 17 20:17:10.846338 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:10.846304 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bmcs7" podStartSLOduration=2.497859345 podStartE2EDuration="3.846290881s" podCreationTimestamp="2026-04-17 20:17:07 +0000 UTC" firstStartedPulling="2026-04-17 20:17:08.408260128 +0000 UTC m=+95.484632110" lastFinishedPulling="2026-04-17 20:17:09.756691665 +0000 UTC m=+96.833063646" observedRunningTime="2026-04-17 20:17:10.845358456 +0000 UTC m=+97.921730464" watchObservedRunningTime="2026-04-17 20:17:10.846290881 +0000 UTC m=+97.922662861" Apr 17 20:17:12.234788 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:12.234757 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n6lwg_9d8cbec6-ac91-4373-b7ef-593404bf8a86/node-ca/0.log" Apr 17 20:17:12.835972 ip-10-0-132-57 kubenswrapper[2580]: I0417 
20:17:12.835940 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zckcj" event={"ID":"58c3570c-5668-47bd-bf38-25544374725a","Type":"ContainerStarted","Data":"c7e2162f50bee0327f4144a5e91b78b16a4dfee6a4d294832c5fe2a0347d477a"} Apr 17 20:17:12.850931 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:12.850880 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-zckcj" podStartSLOduration=1.024957148 podStartE2EDuration="2.850863084s" podCreationTimestamp="2026-04-17 20:17:10 +0000 UTC" firstStartedPulling="2026-04-17 20:17:10.605801858 +0000 UTC m=+97.682173838" lastFinishedPulling="2026-04-17 20:17:12.431707793 +0000 UTC m=+99.508079774" observedRunningTime="2026-04-17 20:17:12.850573783 +0000 UTC m=+99.926946012" watchObservedRunningTime="2026-04-17 20:17:12.850863084 +0000 UTC m=+99.927235088" Apr 17 20:17:13.235806 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:13.235734 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bmcs7_719e086f-1a8a-434b-90a0-cd72fcae76c0/migrator/0.log" Apr 17 20:17:13.435653 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:13.435611 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bmcs7_719e086f-1a8a-434b-90a0-cd72fcae76c0/graceful-termination/0.log" Apr 17 20:17:13.779871 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:13.779838 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-d9lfz" Apr 17 20:17:17.927920 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:17.927883 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:17.928487 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:17.927944 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:17.928487 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:17.928009 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ht5sn\" (UID: \"658a2744-7815-42ec-bb05-a0eeeb012bc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:17.928487 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:17.928079 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:17:17.928487 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:17.928143 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs podName:3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:17:33.928128532 +0000 UTC m=+121.004500517 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs") pod "router-default-65d65c8fdb-58dhc" (UID: "3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9") : secret "router-metrics-certs-default" not found Apr 17 20:17:17.928487 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:17.928166 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle podName:3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:33.928151261 +0000 UTC m=+121.004523241 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle") pod "router-default-65d65c8fdb-58dhc" (UID: "3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9") : configmap references non-existent config key: service-ca.crt Apr 17 20:17:17.930500 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:17.930484 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/658a2744-7815-42ec-bb05-a0eeeb012bc4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ht5sn\" (UID: \"658a2744-7815-42ec-bb05-a0eeeb012bc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:18.074671 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:18.074641 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" Apr 17 20:17:18.185980 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:18.185913 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn"] Apr 17 20:17:18.849874 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:18.849791 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" event={"ID":"658a2744-7815-42ec-bb05-a0eeeb012bc4","Type":"ContainerStarted","Data":"d3f4cafb14fa8c253f349e42a93b6c69b000044604fcffb6735fc75a7e967d04"} Apr 17 20:17:20.857629 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:20.857597 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" event={"ID":"658a2744-7815-42ec-bb05-a0eeeb012bc4","Type":"ContainerStarted","Data":"c1c62c1cd269d381fe0b199add630b0bb5ff2c3d8660236ce252f5d0d6a67f97"} Apr 17 20:17:20.857988 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:20.857638 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" event={"ID":"658a2744-7815-42ec-bb05-a0eeeb012bc4","Type":"ContainerStarted","Data":"c3266db842a71eee944a860752707d40defcb71f939fb102fccfd64bb6ae9346"} Apr 17 20:17:20.872025 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:20.871975 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ht5sn" podStartSLOduration=16.964263098 podStartE2EDuration="18.871962288s" podCreationTimestamp="2026-04-17 20:17:02 +0000 UTC" firstStartedPulling="2026-04-17 20:17:18.231242348 +0000 UTC 
m=+105.307614345" lastFinishedPulling="2026-04-17 20:17:20.138941554 +0000 UTC m=+107.215313535" observedRunningTime="2026-04-17 20:17:20.871220326 +0000 UTC m=+107.947592337" watchObservedRunningTime="2026-04-17 20:17:20.871962288 +0000 UTC m=+107.948334290" Apr 17 20:17:30.370552 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.370505 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2vjgj"] Apr 17 20:17:30.373918 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.373894 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5grgb"] Apr 17 20:17:30.374069 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.374051 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.376695 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.376673 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 20:17:30.376695 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.376682 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 20:17:30.376855 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.376812 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5grgb" Apr 17 20:17:30.377167 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.377147 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-59lxf\"" Apr 17 20:17:30.377280 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.377264 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 20:17:30.377342 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.377315 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 20:17:30.379131 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.379101 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 20:17:30.379131 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.379127 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-89r4h\"" Apr 17 20:17:30.379277 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.379142 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 20:17:30.383077 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.383055 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2vjgj"] Apr 17 20:17:30.383861 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.383841 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5grgb"] Apr 17 20:17:30.467856 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.467832 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-49zfd"] Apr 17 20:17:30.470709 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.470692 2580 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-49zfd" Apr 17 20:17:30.472931 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.472909 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 20:17:30.472931 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.472909 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-7hf2v\"" Apr 17 20:17:30.473078 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.473043 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 20:17:30.479774 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.479755 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-49zfd"] Apr 17 20:17:30.518399 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.518375 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a1e13515-b5f1-46d8-bc11-773c27792e7e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.518521 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.518404 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e2d98af-d82a-4f4f-a48d-77a96c374f2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5grgb\" (UID: \"5e2d98af-d82a-4f4f-a48d-77a96c374f2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5grgb" Apr 17 20:17:30.518521 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.518432 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5e2d98af-d82a-4f4f-a48d-77a96c374f2c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5grgb\" (UID: \"5e2d98af-d82a-4f4f-a48d-77a96c374f2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5grgb" Apr 17 20:17:30.518647 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.518522 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a1e13515-b5f1-46d8-bc11-773c27792e7e-crio-socket\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.518647 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.518546 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r5kp\" (UniqueName: \"kubernetes.io/projected/a1e13515-b5f1-46d8-bc11-773c27792e7e-kube-api-access-9r5kp\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.518647 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.518588 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a1e13515-b5f1-46d8-bc11-773c27792e7e-data-volume\") pod 
\"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.518647 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.518604 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a1e13515-b5f1-46d8-bc11-773c27792e7e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.619884 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.619857 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a1e13515-b5f1-46d8-bc11-773c27792e7e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.620012 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.619886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e2d98af-d82a-4f4f-a48d-77a96c374f2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5grgb\" (UID: \"5e2d98af-d82a-4f4f-a48d-77a96c374f2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5grgb" Apr 17 20:17:30.620012 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.619913 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5e2d98af-d82a-4f4f-a48d-77a96c374f2c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5grgb\" (UID: \"5e2d98af-d82a-4f4f-a48d-77a96c374f2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5grgb" Apr 17 20:17:30.620131 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.620068 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a1e13515-b5f1-46d8-bc11-773c27792e7e-crio-socket\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.620131 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.620096 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9r5kp\" (UniqueName: \"kubernetes.io/projected/a1e13515-b5f1-46d8-bc11-773c27792e7e-kube-api-access-9r5kp\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.620223 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.620134 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76875\" (UniqueName: \"kubernetes.io/projected/7dcf50fe-0775-4474-b4fd-e451ff50c3a5-kube-api-access-76875\") pod \"downloads-6bcc868b7-49zfd\" (UID: \"7dcf50fe-0775-4474-b4fd-e451ff50c3a5\") " pod="openshift-console/downloads-6bcc868b7-49zfd" Apr 17 20:17:30.620223 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.620168 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a1e13515-b5f1-46d8-bc11-773c27792e7e-data-volume\") pod 
\"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.620223 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.620193 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a1e13515-b5f1-46d8-bc11-773c27792e7e-crio-socket\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.620223 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.620194 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a1e13515-b5f1-46d8-bc11-773c27792e7e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.620615 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.620534 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a1e13515-b5f1-46d8-bc11-773c27792e7e-data-volume\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.620886 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.620756 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a1e13515-b5f1-46d8-bc11-773c27792e7e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.621219 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.621192 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5e2d98af-d82a-4f4f-a48d-77a96c374f2c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5grgb\" (UID: \"5e2d98af-d82a-4f4f-a48d-77a96c374f2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5grgb" Apr 17 20:17:30.622451 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.622429 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a1e13515-b5f1-46d8-bc11-773c27792e7e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.622580 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.622563 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e2d98af-d82a-4f4f-a48d-77a96c374f2c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5grgb\" (UID: \"5e2d98af-d82a-4f4f-a48d-77a96c374f2c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5grgb" Apr 17 20:17:30.629646 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.629625 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r5kp\" (UniqueName: \"kubernetes.io/projected/a1e13515-b5f1-46d8-bc11-773c27792e7e-kube-api-access-9r5kp\") pod \"insights-runtime-extractor-2vjgj\" (UID: \"a1e13515-b5f1-46d8-bc11-773c27792e7e\") " 
pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.684050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.684031 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2vjgj" Apr 17 20:17:30.690136 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.690116 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5grgb" Apr 17 20:17:30.721343 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.721190 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76875\" (UniqueName: \"kubernetes.io/projected/7dcf50fe-0775-4474-b4fd-e451ff50c3a5-kube-api-access-76875\") pod \"downloads-6bcc868b7-49zfd\" (UID: \"7dcf50fe-0775-4474-b4fd-e451ff50c3a5\") " pod="openshift-console/downloads-6bcc868b7-49zfd" Apr 17 20:17:30.729516 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.729484 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76875\" (UniqueName: \"kubernetes.io/projected/7dcf50fe-0775-4474-b4fd-e451ff50c3a5-kube-api-access-76875\") pod \"downloads-6bcc868b7-49zfd\" (UID: \"7dcf50fe-0775-4474-b4fd-e451ff50c3a5\") " pod="openshift-console/downloads-6bcc868b7-49zfd" Apr 17 20:17:30.778965 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.778933 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-49zfd" Apr 17 20:17:30.809707 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.809677 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2vjgj"] Apr 17 20:17:30.813965 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:30.813938 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1e13515_b5f1_46d8_bc11_773c27792e7e.slice/crio-1ecdf8bf6b1cfe01d2b849e0b0fb2be4dd1441ceb6308a9d9df0be2b9cae022e WatchSource:0}: Error finding container 1ecdf8bf6b1cfe01d2b849e0b0fb2be4dd1441ceb6308a9d9df0be2b9cae022e: Status 404 returned error can't find the container with id 1ecdf8bf6b1cfe01d2b849e0b0fb2be4dd1441ceb6308a9d9df0be2b9cae022e Apr 17 20:17:30.826496 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.826166 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5grgb"] Apr 17 20:17:30.830485 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:30.830436 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e2d98af_d82a_4f4f_a48d_77a96c374f2c.slice/crio-f2b45c8ee28245115b52e20aba9e6bddccc2400db9188165e82755e7f2284a1b WatchSource:0}: Error finding container f2b45c8ee28245115b52e20aba9e6bddccc2400db9188165e82755e7f2284a1b: Status 404 returned error can't find the container with id f2b45c8ee28245115b52e20aba9e6bddccc2400db9188165e82755e7f2284a1b Apr 17 20:17:30.883180 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.883149 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5grgb" event={"ID":"5e2d98af-d82a-4f4f-a48d-77a96c374f2c","Type":"ContainerStarted","Data":"f2b45c8ee28245115b52e20aba9e6bddccc2400db9188165e82755e7f2284a1b"} Apr 17 20:17:30.884450 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.884422 2580 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-insights/insights-runtime-extractor-2vjgj" event={"ID":"a1e13515-b5f1-46d8-bc11-773c27792e7e","Type":"ContainerStarted","Data":"13738d723e0e42b6c285068ecc9ed1f542c05a14c809cd49f765501c539d524c"} Apr 17 20:17:30.884573 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.884522 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2vjgj" event={"ID":"a1e13515-b5f1-46d8-bc11-773c27792e7e","Type":"ContainerStarted","Data":"1ecdf8bf6b1cfe01d2b849e0b0fb2be4dd1441ceb6308a9d9df0be2b9cae022e"} Apr 17 20:17:30.899508 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:30.899487 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-49zfd"] Apr 17 20:17:30.902659 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:30.902633 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dcf50fe_0775_4474_b4fd_e451ff50c3a5.slice/crio-7a909dfb4c72bd1fb09222b175050331702d403cbe3ae65f76109ba3be9e6bcf WatchSource:0}: Error finding container 7a909dfb4c72bd1fb09222b175050331702d403cbe3ae65f76109ba3be9e6bcf: Status 404 returned error can't find the container with id 7a909dfb4c72bd1fb09222b175050331702d403cbe3ae65f76109ba3be9e6bcf Apr 17 20:17:31.888276 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:31.888246 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5grgb" event={"ID":"5e2d98af-d82a-4f4f-a48d-77a96c374f2c","Type":"ContainerStarted","Data":"86896efe916b7d43d54ff823ebfc694d83f99a04f07194622fcbc627cc48b1aa"} Apr 17 20:17:31.890100 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:31.890076 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2vjgj" event={"ID":"a1e13515-b5f1-46d8-bc11-773c27792e7e","Type":"ContainerStarted","Data":"2aab258148e0f37afd9b2c93992cb4149cf9e6695693b4ca784230c9ed54c201"} Apr 17 20:17:31.891205 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:31.891182 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-49zfd" event={"ID":"7dcf50fe-0775-4474-b4fd-e451ff50c3a5","Type":"ContainerStarted","Data":"7a909dfb4c72bd1fb09222b175050331702d403cbe3ae65f76109ba3be9e6bcf"} Apr 17 20:17:31.901698 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:31.901656 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5grgb" podStartSLOduration=0.945716469 podStartE2EDuration="1.901641821s" podCreationTimestamp="2026-04-17 20:17:30 +0000 UTC" firstStartedPulling="2026-04-17 20:17:30.832698799 +0000 UTC m=+117.909070782" lastFinishedPulling="2026-04-17 20:17:31.788624148 +0000 UTC m=+118.864996134" observedRunningTime="2026-04-17 20:17:31.901002781 +0000 UTC m=+118.977374786" watchObservedRunningTime="2026-04-17 20:17:31.901641821 +0000 UTC m=+118.978013820" Apr 17 20:17:33.898298 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:33.898237 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2vjgj" event={"ID":"a1e13515-b5f1-46d8-bc11-773c27792e7e","Type":"ContainerStarted","Data":"27be125901782efe5b56494f650835b9c7f7fb8f25d51fa4f374aaba6071d532"} Apr 17 20:17:33.913708 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:33.913663 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-insights/insights-runtime-extractor-2vjgj" podStartSLOduration=1.686622178 podStartE2EDuration="3.913645694s" podCreationTimestamp="2026-04-17 20:17:30 +0000 UTC" firstStartedPulling="2026-04-17 20:17:30.872540628 +0000 UTC m=+117.948912615" lastFinishedPulling="2026-04-17 20:17:33.099564145 +0000 UTC m=+120.175936131" observedRunningTime="2026-04-17 20:17:33.913072114 +0000 UTC m=+120.989444117" watchObservedRunningTime="2026-04-17 20:17:33.913645694 +0000 UTC m=+120.990017696" Apr 17 20:17:33.947407 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:33.947376 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:33.947584 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:33.947479 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:33.948145 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:33.948118 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-service-ca-bundle\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:33.950436 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:33.950411 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9-metrics-certs\") pod \"router-default-65d65c8fdb-58dhc\" (UID: \"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9\") " pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:33.982406 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:33.982378 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-5jdcv\"" Apr 17 20:17:33.991164 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:33.991131 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:34.127735 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.127702 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-65d65c8fdb-58dhc"] Apr 17 20:17:34.628239 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.628204 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76df7c9b79-w59dp"] Apr 17 20:17:34.631543 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.631515 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.633874 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.633843 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 20:17:34.633874 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.633861 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 20:17:34.634060 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.633890 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 20:17:34.634060 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.633843 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 20:17:34.634060 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.633960 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-hzws2\"" Apr 17 20:17:34.634060 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.634017 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 20:17:34.640786 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.640757 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76df7c9b79-w59dp"] Apr 17 20:17:34.755232 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.755198 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-service-ca\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.755394 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.755280 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-oauth-serving-cert\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.755394 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.755330 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns6bd\" (UniqueName: \"kubernetes.io/projected/a04f1a62-018f-41d6-a075-60c2c6d70790-kube-api-access-ns6bd\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.755592 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.755416 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-console-config\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.755592 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.755483 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a04f1a62-018f-41d6-a075-60c2c6d70790-console-oauth-config\") pod \"console-76df7c9b79-w59dp\" 
(UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.755592 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.755512 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a04f1a62-018f-41d6-a075-60c2c6d70790-console-serving-cert\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.856730 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.856697 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-console-config\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.856730 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.856736 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a04f1a62-018f-41d6-a075-60c2c6d70790-console-oauth-config\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.856979 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.856754 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a04f1a62-018f-41d6-a075-60c2c6d70790-console-serving-cert\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.856979 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.856786 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-service-ca\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.856979 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.856839 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-oauth-serving-cert\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.856979 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.856869 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ns6bd\" (UniqueName: \"kubernetes.io/projected/a04f1a62-018f-41d6-a075-60c2c6d70790-kube-api-access-ns6bd\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.857673 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.857640 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-service-ca\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.857786 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.857771 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-console-config\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.857889 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.857863 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-oauth-serving-cert\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.859685 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.859666 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a04f1a62-018f-41d6-a075-60c2c6d70790-console-serving-cert\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.860136 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.860119 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a04f1a62-018f-41d6-a075-60c2c6d70790-console-oauth-config\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.864288 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.864247 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns6bd\" (UniqueName: \"kubernetes.io/projected/a04f1a62-018f-41d6-a075-60c2c6d70790-kube-api-access-ns6bd\") pod \"console-76df7c9b79-w59dp\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.902844 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.902753 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-65d65c8fdb-58dhc" event={"ID":"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9","Type":"ContainerStarted","Data":"2d2609dc3d6f54e79ef6231ab2b5453ae68f97411d437b29c96a3c542c29d949"} Apr 17 20:17:34.902844 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.902801 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-65d65c8fdb-58dhc" event={"ID":"3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9","Type":"ContainerStarted","Data":"25d910e902162dcbff98c431f9a3d91740bce1ac8a047e4345904081027389ec"} Apr 17 20:17:34.920521 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.920452 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-65d65c8fdb-58dhc" podStartSLOduration=32.920434216 podStartE2EDuration="32.920434216s" podCreationTimestamp="2026-04-17 20:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:17:34.918551777 +0000 UTC m=+121.994923801" watchObservedRunningTime="2026-04-17 20:17:34.920434216 +0000 UTC m=+121.996806220" Apr 17 20:17:34.942705 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.942676 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:34.991928 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.991892 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:34.994403 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:34.994381 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:35.076187 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:35.076157 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76df7c9b79-w59dp"] Apr 17 20:17:35.079072 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:35.079038 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda04f1a62_018f_41d6_a075_60c2c6d70790.slice/crio-a0ecd37edef089375e33a1e3ef0be6eae59c23560134360a95cdfd2f75c5f5c3 WatchSource:0}: Error finding container a0ecd37edef089375e33a1e3ef0be6eae59c23560134360a95cdfd2f75c5f5c3: Status 404 returned error can't find the container with id a0ecd37edef089375e33a1e3ef0be6eae59c23560134360a95cdfd2f75c5f5c3 Apr 17 20:17:35.906353 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:35.906314 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76df7c9b79-w59dp" event={"ID":"a04f1a62-018f-41d6-a075-60c2c6d70790","Type":"ContainerStarted","Data":"a0ecd37edef089375e33a1e3ef0be6eae59c23560134360a95cdfd2f75c5f5c3"} Apr 17 20:17:35.906827 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:35.906680 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:35.908041 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:35.908020 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-65d65c8fdb-58dhc" Apr 17 20:17:38.916030 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:38.915997 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76df7c9b79-w59dp" event={"ID":"a04f1a62-018f-41d6-a075-60c2c6d70790","Type":"ContainerStarted","Data":"9af88f5ac16a593fe31417290ebbdcee276a244e2fd8c267a3a1e8f2a8edc5c4"} Apr 17 20:17:38.931877 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:38.931832 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76df7c9b79-w59dp" podStartSLOduration=1.7620536869999999 podStartE2EDuration="4.931814694s" podCreationTimestamp="2026-04-17 20:17:34 +0000 UTC" firstStartedPulling="2026-04-17 20:17:35.081070718 +0000 UTC m=+122.157442704" lastFinishedPulling="2026-04-17 20:17:38.250831715 +0000 UTC m=+125.327203711" observedRunningTime="2026-04-17 20:17:38.930309094 +0000 UTC m=+126.006681119" watchObservedRunningTime="2026-04-17 20:17:38.931814694 +0000 UTC m=+126.008186696" Apr 17 20:17:43.232049 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.232013 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:17:43.234669 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.234640 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d943896a-8c08-4d43-b1c4-d738b0079503-metrics-certs\") pod \"network-metrics-daemon-99wq2\" (UID: \"d943896a-8c08-4d43-b1c4-d738b0079503\") " pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:17:43.338331 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.338298 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rfhz5\"" Apr 17 20:17:43.346990 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.346964 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99wq2" Apr 17 20:17:43.832172 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.832140 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d9fc4bfc5-xnjpn"] Apr 17 20:17:43.838489 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.838445 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:43.845283 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.845259 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d9fc4bfc5-xnjpn"] Apr 17 20:17:43.846471 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.846282 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 20:17:43.938271 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.938234 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-service-ca\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:43.938434 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.938366 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5f2d\" (UniqueName: \"kubernetes.io/projected/132b50a8-deb0-486b-a136-71fb422c3919-kube-api-access-w5f2d\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:43.938434 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.938404 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/132b50a8-deb0-486b-a136-71fb422c3919-console-serving-cert\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:43.938434 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.938426 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/132b50a8-deb0-486b-a136-71fb422c3919-console-oauth-config\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:43.938588 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.938483 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-oauth-serving-cert\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: 
\"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:43.938588 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.938536 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-trusted-ca-bundle\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:43.938674 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:43.938585 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-console-config\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.039589 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.039551 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5f2d\" (UniqueName: \"kubernetes.io/projected/132b50a8-deb0-486b-a136-71fb422c3919-kube-api-access-w5f2d\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.039589 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.039595 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/132b50a8-deb0-486b-a136-71fb422c3919-console-serving-cert\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.039800 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.039622 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/132b50a8-deb0-486b-a136-71fb422c3919-console-oauth-config\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.039800 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.039673 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-oauth-serving-cert\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.039800 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.039705 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-trusted-ca-bundle\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.039800 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.039735 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-console-config\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.039800 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.039773 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-service-ca\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.040807 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.040516 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-service-ca\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.040807 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.040733 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-console-config\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.040807 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.040752 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-trusted-ca-bundle\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.041058 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.040998 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-oauth-serving-cert\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.042705 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.042675 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/132b50a8-deb0-486b-a136-71fb422c3919-console-oauth-config\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.042867 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.042842 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/132b50a8-deb0-486b-a136-71fb422c3919-console-serving-cert\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.047384 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.047360 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5f2d\" (UniqueName: \"kubernetes.io/projected/132b50a8-deb0-486b-a136-71fb422c3919-kube-api-access-w5f2d\") pod \"console-5d9fc4bfc5-xnjpn\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.149573 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.149496 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:44.943790 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.943757 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:44.943790 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.943794 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:17:44.945338 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.945311 2580 patch_prober.go:28] interesting pod/console-76df7c9b79-w59dp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.14:8443/health\": dial tcp 10.134.0.14:8443: connect: connection refused" start-of-body= Apr 17 20:17:44.945486 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:44.945370 2580 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-76df7c9b79-w59dp" podUID="a04f1a62-018f-41d6-a075-60c2c6d70790" containerName="console" probeResult="failure" output="Get \"https://10.134.0.14:8443/health\": dial tcp 10.134.0.14:8443: connect: connection refused" Apr 17 20:17:46.312103 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.312067 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v"] Apr 17 20:17:46.317918 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.317893 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.320262 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.320200 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 20:17:46.320778 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.320601 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 20:17:46.321113 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.321042 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 20:17:46.321187 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.321157 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 20:17:46.321286 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.321267 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-hdl8s\"" Apr 17 20:17:46.321390 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.321374 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 20:17:46.325491 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.325447 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v"] Apr 17 20:17:46.339394 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.338144 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-79v5j"] Apr 17 20:17:46.341954 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.341937 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.344286 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.343944 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 20:17:46.344286 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.343946 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 20:17:46.344286 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.344179 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 20:17:46.344509 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.344304 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kf58f\"" Apr 17 20:17:46.460770 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.460729 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-metrics-client-ca\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.460961 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.460774 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-accelerators-collector-config\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.460961 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.460811 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-sys\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.460961 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.460887 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.460961 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.460949 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-root\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.461212 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.460980 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm595\" (UniqueName: \"kubernetes.io/projected/dd485dc8-bb69-4490-83ff-fb09472c93f4-kube-api-access-cm595\") pod \"openshift-state-metrics-9d44df66c-qqk8v\" (UID: \"dd485dc8-bb69-4490-83ff-fb09472c93f4\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.461212 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.461009 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-tls\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.461212 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.461037 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd485dc8-bb69-4490-83ff-fb09472c93f4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-qqk8v\" (UID: \"dd485dc8-bb69-4490-83ff-fb09472c93f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.461212 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.461070 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v28ll\" (UniqueName: \"kubernetes.io/projected/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-kube-api-access-v28ll\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.461212 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.461110 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd485dc8-bb69-4490-83ff-fb09472c93f4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-qqk8v\" (UID: \"dd485dc8-bb69-4490-83ff-fb09472c93f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.461212 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.461139 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd485dc8-bb69-4490-83ff-fb09472c93f4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-qqk8v\" (UID: \"dd485dc8-bb69-4490-83ff-fb09472c93f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.461212 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.461209 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-textfile\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.461591 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.461237 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-wtmp\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.562314 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562223 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.562314 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562295 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-root\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.562569 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562324 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cm595\" (UniqueName: \"kubernetes.io/projected/dd485dc8-bb69-4490-83ff-fb09472c93f4-kube-api-access-cm595\") pod \"openshift-state-metrics-9d44df66c-qqk8v\" (UID: \"dd485dc8-bb69-4490-83ff-fb09472c93f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.562569 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562351 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-tls\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.562569 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562372 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd485dc8-bb69-4490-83ff-fb09472c93f4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-qqk8v\" (UID: \"dd485dc8-bb69-4490-83ff-fb09472c93f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.562569 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562405 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v28ll\" (UniqueName: \"kubernetes.io/projected/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-kube-api-access-v28ll\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.562569 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562443 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd485dc8-bb69-4490-83ff-fb09472c93f4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-qqk8v\" (UID: \"dd485dc8-bb69-4490-83ff-fb09472c93f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.562569 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562498 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd485dc8-bb69-4490-83ff-fb09472c93f4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-qqk8v\" (UID: \"dd485dc8-bb69-4490-83ff-fb09472c93f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.562569 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562554 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-textfile\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.562888 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562581 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-wtmp\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.562888 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562620 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-metrics-client-ca\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.562888 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562646 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-accelerators-collector-config\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.562888 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562685 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-sys\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.562888 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.562780 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-sys\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.565245 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.563714 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-wtmp\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.565245 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.564002 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-textfile\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.565245 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.564137 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-root\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.565245 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.564657 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-metrics-client-ca\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.565245 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.564884 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd485dc8-bb69-4490-83ff-fb09472c93f4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-qqk8v\" (UID: \"dd485dc8-bb69-4490-83ff-fb09472c93f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.565245 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.565158 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-accelerators-collector-config\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.566565 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.566523 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd485dc8-bb69-4490-83ff-fb09472c93f4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-qqk8v\" (UID: \"dd485dc8-bb69-4490-83ff-fb09472c93f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.566771 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.566751 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.566837 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.566761 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-node-exporter-tls\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.566883 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.566850 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd485dc8-bb69-4490-83ff-fb09472c93f4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-qqk8v\" (UID: \"dd485dc8-bb69-4490-83ff-fb09472c93f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.571950 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.571928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v28ll\" (UniqueName: \"kubernetes.io/projected/695db0e5-f748-4b2b-ad9f-a5a810dcad9b-kube-api-access-v28ll\") pod \"node-exporter-79v5j\" (UID: \"695db0e5-f748-4b2b-ad9f-a5a810dcad9b\") " pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:46.572049 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.571945 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm595\" (UniqueName: 
\"kubernetes.io/projected/dd485dc8-bb69-4490-83ff-fb09472c93f4-kube-api-access-cm595\") pod \"openshift-state-metrics-9d44df66c-qqk8v\" (UID: \"dd485dc8-bb69-4490-83ff-fb09472c93f4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.631411 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.631381 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" Apr 17 20:17:46.654323 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:46.654295 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-79v5j" Apr 17 20:17:47.222098 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:47.222054 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod695db0e5_f748_4b2b_ad9f_a5a810dcad9b.slice/crio-61d457397a836b87f59c9c5e9fcd7060b333aa40faeae1a67000b0a482468b98 WatchSource:0}: Error finding container 61d457397a836b87f59c9c5e9fcd7060b333aa40faeae1a67000b0a482468b98: Status 404 returned error can't find the container with id 61d457397a836b87f59c9c5e9fcd7060b333aa40faeae1a67000b0a482468b98 Apr 17 20:17:47.393426 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.393237 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-99wq2"] Apr 17 20:17:47.396893 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:47.396543 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd943896a_8c08_4d43_b1c4_d738b0079503.slice/crio-36517cd175b1f4d1b6baaa53d0aa8be68a537a776de1405b981117cfba0b5a9b WatchSource:0}: Error finding container 36517cd175b1f4d1b6baaa53d0aa8be68a537a776de1405b981117cfba0b5a9b: Status 404 returned error can't find the container with id 36517cd175b1f4d1b6baaa53d0aa8be68a537a776de1405b981117cfba0b5a9b Apr 17 20:17:47.397664 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.397644 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:17:47.403397 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.403378 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.405877 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.405822 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 20:17:47.406051 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.405900 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-hx2vz\"" Apr 17 20:17:47.407043 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.406164 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 20:17:47.407043 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.406240 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 20:17:47.407043 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.406356 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 20:17:47.407043 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.406403 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 20:17:47.407043 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.406592 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 20:17:47.407043 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.406602 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 20:17:47.407043 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.406715 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 20:17:47.407043 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.406991 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 20:17:47.414263 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.414147 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:17:47.416995 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.416959 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d9fc4bfc5-xnjpn"] Apr 17 20:17:47.420721 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:47.420693 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod132b50a8_deb0_486b_a136_71fb422c3919.slice/crio-d4eba72ca64a3c6bcfa84a27c5dc0247117c9618b1876c9e5fac492010e83175 WatchSource:0}: Error finding container d4eba72ca64a3c6bcfa84a27c5dc0247117c9618b1876c9e5fac492010e83175: Status 404 returned error can't find the container with id d4eba72ca64a3c6bcfa84a27c5dc0247117c9618b1876c9e5fac492010e83175 Apr 17 20:17:47.434822 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.434802 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v"] Apr 17 20:17:47.438949 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:47.438915 2580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd485dc8_bb69_4490_83ff_fb09472c93f4.slice/crio-4057e1b4dbf595507ccf376691c0cbacaa978d57c466893966a0da312df0866d WatchSource:0}: Error finding container 4057e1b4dbf595507ccf376691c0cbacaa978d57c466893966a0da312df0866d: Status 404 returned error can't find the container with id 4057e1b4dbf595507ccf376691c0cbacaa978d57c466893966a0da312df0866d Apr 17 20:17:47.572625 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.572377 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/79f5d464-0615-476c-8303-51771c3852b6-config-out\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.572625 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.572443 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.572625 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.572516 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-config-volume\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.572625 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.572552 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.572919 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.572625 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/79f5d464-0615-476c-8303-51771c3852b6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.572919 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.572664 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79f5d464-0615-476c-8303-51771c3852b6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.572919 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.572750 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drsqp\" (UniqueName: \"kubernetes.io/projected/79f5d464-0615-476c-8303-51771c3852b6-kube-api-access-drsqp\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.572919 ip-10-0-132-57 kubenswrapper[2580]: I0417 
20:17:47.572787 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-web-config\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.572919 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.572830 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.572919 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.572859 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.572919 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.572899 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79f5d464-0615-476c-8303-51771c3852b6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.573264 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.572929 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.573264 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.572962 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/79f5d464-0615-476c-8303-51771c3852b6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.674515 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.674261 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.674515 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.674308 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/79f5d464-0615-476c-8303-51771c3852b6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.674515 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.674329 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79f5d464-0615-476c-8303-51771c3852b6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.674787 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.674522 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drsqp\" (UniqueName: \"kubernetes.io/projected/79f5d464-0615-476c-8303-51771c3852b6-kube-api-access-drsqp\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.674787 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.674572 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-web-config\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.674787 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.674618 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.674787 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.674645 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.674787 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.674686 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79f5d464-0615-476c-8303-51771c3852b6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.674787 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.674710 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/79f5d464-0615-476c-8303-51771c3852b6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.675097 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.674718 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.675097 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.675061 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/79f5d464-0615-476c-8303-51771c3852b6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
20:17:47.675206 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.675104 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/79f5d464-0615-476c-8303-51771c3852b6-config-out\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.675206 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.675155 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.675303 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.675207 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-config-volume\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.676649 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.676293 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79f5d464-0615-476c-8303-51771c3852b6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.678728 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.678698 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79f5d464-0615-476c-8303-51771c3852b6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.679242 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.679200 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.680689 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.680239 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.680689 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.680293 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/79f5d464-0615-476c-8303-51771c3852b6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.680689 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.680574 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-web-config\") pod 
\"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.680689 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.680653 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.681327 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.681284 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.681664 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.681642 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/79f5d464-0615-476c-8303-51771c3852b6-config-out\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.682120 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.682099 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.682660 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.682636 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/79f5d464-0615-476c-8303-51771c3852b6-config-volume\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.683646 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.683624 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drsqp\" (UniqueName: \"kubernetes.io/projected/79f5d464-0615-476c-8303-51771c3852b6-kube-api-access-drsqp\") pod \"alertmanager-main-0\" (UID: \"79f5d464-0615-476c-8303-51771c3852b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.715286 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.715223 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 20:17:47.870224 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.870167 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 20:17:47.945073 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.944987 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-49zfd" event={"ID":"7dcf50fe-0775-4474-b4fd-e451ff50c3a5","Type":"ContainerStarted","Data":"d4794b80cfea3fb15f00b9ccf4bd4c34ba060a028b18f642097c93fb59feac71"} Apr 17 20:17:47.945597 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.945564 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-49zfd" Apr 17 20:17:47.948288 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.948243 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-99wq2" event={"ID":"d943896a-8c08-4d43-b1c4-d738b0079503","Type":"ContainerStarted","Data":"36517cd175b1f4d1b6baaa53d0aa8be68a537a776de1405b981117cfba0b5a9b"} Apr 17 20:17:47.949791 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.949766 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-79v5j" event={"ID":"695db0e5-f748-4b2b-ad9f-a5a810dcad9b","Type":"ContainerStarted","Data":"61d457397a836b87f59c9c5e9fcd7060b333aa40faeae1a67000b0a482468b98"} Apr 17 20:17:47.951889 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.951828 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" event={"ID":"dd485dc8-bb69-4490-83ff-fb09472c93f4","Type":"ContainerStarted","Data":"38e47438b58909bd5943b805d196e15ab2fd0c7148c145d8ff4068439f19518c"} Apr 17 20:17:47.951889 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.951857 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" event={"ID":"dd485dc8-bb69-4490-83ff-fb09472c93f4","Type":"ContainerStarted","Data":"358a427a45fe7a793a813b1cedaa06949039a14ec45ed63ab647d8505534d8eb"} Apr 17 20:17:47.951889 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.951872 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" event={"ID":"dd485dc8-bb69-4490-83ff-fb09472c93f4","Type":"ContainerStarted","Data":"4057e1b4dbf595507ccf376691c0cbacaa978d57c466893966a0da312df0866d"} Apr 17 20:17:47.954480 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.954439 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9fc4bfc5-xnjpn" event={"ID":"132b50a8-deb0-486b-a136-71fb422c3919","Type":"ContainerStarted","Data":"dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab"} Apr 17 20:17:47.954565 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.954485 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9fc4bfc5-xnjpn" event={"ID":"132b50a8-deb0-486b-a136-71fb422c3919","Type":"ContainerStarted","Data":"d4eba72ca64a3c6bcfa84a27c5dc0247117c9618b1876c9e5fac492010e83175"} Apr 17 20:17:47.960548 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.960518 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-49zfd" Apr 17 20:17:47.961678 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.961631 2580 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-console/downloads-6bcc868b7-49zfd" podStartSLOduration=1.5188984589999999 podStartE2EDuration="17.961619756s" podCreationTimestamp="2026-04-17 20:17:30 +0000 UTC" firstStartedPulling="2026-04-17 20:17:30.905256862 +0000 UTC m=+117.981628843" lastFinishedPulling="2026-04-17 20:17:47.347978157 +0000 UTC m=+134.424350140" observedRunningTime="2026-04-17 20:17:47.960295057 +0000 UTC m=+135.036667093" watchObservedRunningTime="2026-04-17 20:17:47.961619756 +0000 UTC m=+135.037991758" Apr 17 20:17:47.980524 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:47.980363 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d9fc4bfc5-xnjpn" podStartSLOduration=4.980345947 podStartE2EDuration="4.980345947s" podCreationTimestamp="2026-04-17 20:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:17:47.979443895 +0000 UTC m=+135.055815909" watchObservedRunningTime="2026-04-17 20:17:47.980345947 +0000 UTC m=+135.056717951" Apr 17 20:17:48.010526 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:48.010452 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f5d464_0615_476c_8303_51771c3852b6.slice/crio-0826f762a6a42a2b609fefd5fb1a6ae88d38637ee0db3c16cfc46b9578d63a14 WatchSource:0}: Error finding container 0826f762a6a42a2b609fefd5fb1a6ae88d38637ee0db3c16cfc46b9578d63a14: Status 404 returned error can't find the container with id 0826f762a6a42a2b609fefd5fb1a6ae88d38637ee0db3c16cfc46b9578d63a14 Apr 17 20:17:48.962510 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:48.961611 2580 generic.go:358] "Generic (PLEG): container finished" podID="695db0e5-f748-4b2b-ad9f-a5a810dcad9b" containerID="66436dad5ca5682403389adafa4e6097c0adca16c2e6c29f96a6daed66cb895a" exitCode=0 Apr 17 20:17:48.962510 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:48.961711 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-79v5j" event={"ID":"695db0e5-f748-4b2b-ad9f-a5a810dcad9b","Type":"ContainerDied","Data":"66436dad5ca5682403389adafa4e6097c0adca16c2e6c29f96a6daed66cb895a"} Apr 17 20:17:48.969344 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:48.969316 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"79f5d464-0615-476c-8303-51771c3852b6","Type":"ContainerStarted","Data":"0826f762a6a42a2b609fefd5fb1a6ae88d38637ee0db3c16cfc46b9578d63a14"} Apr 17 20:17:49.974900 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:49.974829 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-99wq2" event={"ID":"d943896a-8c08-4d43-b1c4-d738b0079503","Type":"ContainerStarted","Data":"a8c4c70acc4d7d0c3218dcf5b767a5d83214aa66036d851f71dcfe9d09e1e556"} Apr 17 20:17:49.974900 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:49.974876 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-99wq2" event={"ID":"d943896a-8c08-4d43-b1c4-d738b0079503","Type":"ContainerStarted","Data":"d6327c0f08085387b0ef5cf4e6fab2516fc0d99ae897611c5afa215b788b19f2"} Apr 17 20:17:49.977584 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:49.977488 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-79v5j" 
event={"ID":"695db0e5-f748-4b2b-ad9f-a5a810dcad9b","Type":"ContainerStarted","Data":"c81528aa040f54f5030cb44abc57b7e68cad774e4b9c687108270508534d8493"} Apr 17 20:17:49.977584 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:49.977523 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-79v5j" event={"ID":"695db0e5-f748-4b2b-ad9f-a5a810dcad9b","Type":"ContainerStarted","Data":"e7ae17c3267010ca6bdba2d4677d940108621517219ea2f7c2dba2085aa1f5a3"} Apr 17 20:17:49.979561 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:49.979540 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" event={"ID":"dd485dc8-bb69-4490-83ff-fb09472c93f4","Type":"ContainerStarted","Data":"b4bf225e8cecfc268404c7f06ef1358e9235c92775701d3a6fc02d398d27d068"} Apr 17 20:17:49.989157 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:49.989100 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-99wq2" podStartSLOduration=135.467036255 podStartE2EDuration="2m16.989085674s" podCreationTimestamp="2026-04-17 20:15:33 +0000 UTC" firstStartedPulling="2026-04-17 20:17:47.399207957 +0000 UTC m=+134.475579951" lastFinishedPulling="2026-04-17 20:17:48.921257376 +0000 UTC m=+135.997629370" observedRunningTime="2026-04-17 20:17:49.987959072 +0000 UTC m=+137.064331098" watchObservedRunningTime="2026-04-17 20:17:49.989085674 +0000 UTC m=+137.065457680" Apr 17 20:17:50.004319 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.004276 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-79v5j" podStartSLOduration=3.168963514 podStartE2EDuration="4.004260697s" podCreationTimestamp="2026-04-17 20:17:46 +0000 UTC" firstStartedPulling="2026-04-17 20:17:47.224174617 +0000 UTC m=+134.300546597" lastFinishedPulling="2026-04-17 20:17:48.059471799 +0000 UTC m=+135.135843780" observedRunningTime="2026-04-17 20:17:50.003086454 +0000 UTC m=+137.079458456" watchObservedRunningTime="2026-04-17 20:17:50.004260697 +0000 UTC m=+137.080632699" Apr 17 20:17:50.019586 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.019532 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qqk8v" podStartSLOduration=2.672960278 podStartE2EDuration="4.019518747s" podCreationTimestamp="2026-04-17 20:17:46 +0000 UTC" firstStartedPulling="2026-04-17 20:17:47.576173719 +0000 UTC m=+134.652545704" lastFinishedPulling="2026-04-17 20:17:48.922732187 +0000 UTC m=+135.999104173" observedRunningTime="2026-04-17 20:17:50.017730909 +0000 UTC m=+137.094102911" watchObservedRunningTime="2026-04-17 20:17:50.019518747 +0000 UTC m=+137.095890749" Apr 17 20:17:50.731230 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.731197 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7f9447cc77-gq78m"] Apr 17 20:17:50.749089 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.749058 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f9447cc77-gq78m"] Apr 17 20:17:50.749242 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.749193 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.752527 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.752482 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 20:17:50.752669 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.752654 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9pcv9gtcj9e0v\"" Apr 17 20:17:50.753121 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.753097 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 20:17:50.753320 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.753295 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-h74z4\"" Apr 17 20:17:50.753403 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.753368 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 20:17:50.753548 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.753527 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 20:17:50.806951 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.806919 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fb9d968-11f8-424c-833a-9133403fbf4e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.807099 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.806962 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8fb9d968-11f8-424c-833a-9133403fbf4e-secret-metrics-server-tls\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.807099 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.806987 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8fb9d968-11f8-424c-833a-9133403fbf4e-secret-metrics-server-client-certs\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.807099 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.807070 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq5dn\" (UniqueName: \"kubernetes.io/projected/8fb9d968-11f8-424c-833a-9133403fbf4e-kube-api-access-rq5dn\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.807258 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.807100 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/8fb9d968-11f8-424c-833a-9133403fbf4e-metrics-server-audit-profiles\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.807258 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.807187 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8fb9d968-11f8-424c-833a-9133403fbf4e-audit-log\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.807258 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.807249 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb9d968-11f8-424c-833a-9133403fbf4e-client-ca-bundle\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.908510 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.908471 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8fb9d968-11f8-424c-833a-9133403fbf4e-audit-log\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.908665 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.908523 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb9d968-11f8-424c-833a-9133403fbf4e-client-ca-bundle\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.908665 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.908586 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fb9d968-11f8-424c-833a-9133403fbf4e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.908665 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.908615 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8fb9d968-11f8-424c-833a-9133403fbf4e-secret-metrics-server-tls\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.908665 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.908632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8fb9d968-11f8-424c-833a-9133403fbf4e-secret-metrics-server-client-certs\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.908665 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.908652 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rq5dn\" (UniqueName: 
\"kubernetes.io/projected/8fb9d968-11f8-424c-833a-9133403fbf4e-kube-api-access-rq5dn\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.908917 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.908676 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8fb9d968-11f8-424c-833a-9133403fbf4e-metrics-server-audit-profiles\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.908917 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.908841 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8fb9d968-11f8-424c-833a-9133403fbf4e-audit-log\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.909541 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.909437 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fb9d968-11f8-424c-833a-9133403fbf4e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.909661 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.909590 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8fb9d968-11f8-424c-833a-9133403fbf4e-metrics-server-audit-profiles\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.911345 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.911323 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8fb9d968-11f8-424c-833a-9133403fbf4e-secret-metrics-server-client-certs\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.911486 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.911428 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb9d968-11f8-424c-833a-9133403fbf4e-client-ca-bundle\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.911486 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.911442 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8fb9d968-11f8-424c-833a-9133403fbf4e-secret-metrics-server-tls\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.916372 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.916345 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq5dn\" (UniqueName: 
\"kubernetes.io/projected/8fb9d968-11f8-424c-833a-9133403fbf4e-kube-api-access-rq5dn\") pod \"metrics-server-7f9447cc77-gq78m\" (UID: \"8fb9d968-11f8-424c-833a-9133403fbf4e\") " pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:50.984755 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.984678 2580 generic.go:358] "Generic (PLEG): container finished" podID="79f5d464-0615-476c-8303-51771c3852b6" containerID="18e544100fa42f2c79030515ad2cd6f1aa67b41cb8ca01eacbae43e8eeb7a9bd" exitCode=0 Apr 17 20:17:50.985120 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:50.984811 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"79f5d464-0615-476c-8303-51771c3852b6","Type":"ContainerDied","Data":"18e544100fa42f2c79030515ad2cd6f1aa67b41cb8ca01eacbae43e8eeb7a9bd"} Apr 17 20:17:51.063013 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.062982 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:17:51.119501 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.118866 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv"] Apr 17 20:17:51.153239 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.153140 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv"] Apr 17 20:17:51.153380 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.153306 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv" Apr 17 20:17:51.155909 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.155626 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 20:17:51.156364 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.156182 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-7lg9x\"" Apr 17 20:17:51.203610 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.203577 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f9447cc77-gq78m"] Apr 17 20:17:51.212005 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.211981 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/336ee31c-aa4b-4408-9cbd-46e77e017cfa-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-klvtv\" (UID: \"336ee31c-aa4b-4408-9cbd-46e77e017cfa\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv" Apr 17 20:17:51.313375 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.312818 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/336ee31c-aa4b-4408-9cbd-46e77e017cfa-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-klvtv\" (UID: \"336ee31c-aa4b-4408-9cbd-46e77e017cfa\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv" Apr 17 20:17:51.313375 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:51.313061 2580 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 20:17:51.313375 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:51.313128 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/336ee31c-aa4b-4408-9cbd-46e77e017cfa-monitoring-plugin-cert podName:336ee31c-aa4b-4408-9cbd-46e77e017cfa nodeName:}" failed. No retries permitted until 2026-04-17 20:17:51.813108623 +0000 UTC m=+138.889480623 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/336ee31c-aa4b-4408-9cbd-46e77e017cfa-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-klvtv" (UID: "336ee31c-aa4b-4408-9cbd-46e77e017cfa") : secret "monitoring-plugin-cert" not found Apr 17 20:17:51.516482 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.516431 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx"] Apr 17 20:17:51.544540 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.544513 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.546957 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.546936 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 20:17:51.547100 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.547078 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 20:17:51.547213 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.547191 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 20:17:51.547347 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.546985 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 20:17:51.547526 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.547500 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 20:17:51.547752 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.547737 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-87dng\"" Apr 17 20:17:51.549490 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.549471 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx"] Apr 17 20:17:51.551235 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.551207 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 20:17:51.615289 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.615264 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lpr\" (UniqueName: \"kubernetes.io/projected/754c0844-0836-4dca-9230-21c7a04f6de9-kube-api-access-68lpr\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.615429 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.615327 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/754c0844-0836-4dca-9230-21c7a04f6de9-serving-certs-ca-bundle\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: 
\"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.615512 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.615437 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/754c0844-0836-4dca-9230-21c7a04f6de9-metrics-client-ca\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.615512 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.615488 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/754c0844-0836-4dca-9230-21c7a04f6de9-telemeter-client-tls\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.615512 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.615507 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/754c0844-0836-4dca-9230-21c7a04f6de9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.615659 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.615552 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/754c0844-0836-4dca-9230-21c7a04f6de9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.615659 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.615587 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/754c0844-0836-4dca-9230-21c7a04f6de9-secret-telemeter-client\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.615659 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.615624 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/754c0844-0836-4dca-9230-21c7a04f6de9-federate-client-tls\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.716810 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.716776 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/754c0844-0836-4dca-9230-21c7a04f6de9-telemeter-client-tls\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.716963 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.716823 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/754c0844-0836-4dca-9230-21c7a04f6de9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.716963 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.716857 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/754c0844-0836-4dca-9230-21c7a04f6de9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.716963 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.716899 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/754c0844-0836-4dca-9230-21c7a04f6de9-secret-telemeter-client\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.716963 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.716940 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/754c0844-0836-4dca-9230-21c7a04f6de9-federate-client-tls\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.717171 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.716969 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68lpr\" (UniqueName: \"kubernetes.io/projected/754c0844-0836-4dca-9230-21c7a04f6de9-kube-api-access-68lpr\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.717171 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.717018 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/754c0844-0836-4dca-9230-21c7a04f6de9-serving-certs-ca-bundle\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.718796 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.717840 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/754c0844-0836-4dca-9230-21c7a04f6de9-serving-certs-ca-bundle\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.718796 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.717894 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/754c0844-0836-4dca-9230-21c7a04f6de9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.718796 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.717917 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/754c0844-0836-4dca-9230-21c7a04f6de9-metrics-client-ca\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.718796 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.718752 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/754c0844-0836-4dca-9230-21c7a04f6de9-metrics-client-ca\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.720375 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.720352 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/754c0844-0836-4dca-9230-21c7a04f6de9-federate-client-tls\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.720980 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.720830 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/754c0844-0836-4dca-9230-21c7a04f6de9-telemeter-client-tls\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.721412 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.721388 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/754c0844-0836-4dca-9230-21c7a04f6de9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.721809 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.721775 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/754c0844-0836-4dca-9230-21c7a04f6de9-secret-telemeter-client\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.725634 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.725609 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lpr\" (UniqueName: \"kubernetes.io/projected/754c0844-0836-4dca-9230-21c7a04f6de9-kube-api-access-68lpr\") pod \"telemeter-client-7dffdb7bd4-5fdhx\" (UID: \"754c0844-0836-4dca-9230-21c7a04f6de9\") " pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.818970 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.818905 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/336ee31c-aa4b-4408-9cbd-46e77e017cfa-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-klvtv\" (UID: \"336ee31c-aa4b-4408-9cbd-46e77e017cfa\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv" Apr 17 20:17:51.819231 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:51.819076 2580 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found 
Apr 17 20:17:51.819231 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:17:51.819158 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/336ee31c-aa4b-4408-9cbd-46e77e017cfa-monitoring-plugin-cert podName:336ee31c-aa4b-4408-9cbd-46e77e017cfa nodeName:}" failed. No retries permitted until 2026-04-17 20:17:52.819138867 +0000 UTC m=+139.895510857 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/336ee31c-aa4b-4408-9cbd-46e77e017cfa-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-klvtv" (UID: "336ee31c-aa4b-4408-9cbd-46e77e017cfa") : secret "monitoring-plugin-cert" not found Apr 17 20:17:51.857156 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.856775 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" Apr 17 20:17:51.990724 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:51.989729 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" event={"ID":"8fb9d968-11f8-424c-833a-9133403fbf4e","Type":"ContainerStarted","Data":"281dfd0a790247b7c501896b46f827a076d6e16edb51567d5103c8be5a68e053"} Apr 17 20:17:52.021591 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.021555 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx"] Apr 17 20:17:52.027801 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:52.025901 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod754c0844_0836_4dca_9230_21c7a04f6de9.slice/crio-248aa346ffe0c60e6df68f0a4b0eaa20723324831e4d2144b35a159d3512216d WatchSource:0}: Error finding container 248aa346ffe0c60e6df68f0a4b0eaa20723324831e4d2144b35a159d3512216d: Status 404 returned error can't find the container with id 248aa346ffe0c60e6df68f0a4b0eaa20723324831e4d2144b35a159d3512216d Apr 17 20:17:52.514028 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.513996 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:17:52.537665 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.537640 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:17:52.537962 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.537941 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.541129 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.541096 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-earrvmce1ss8s\"" Apr 17 20:17:52.541283 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.541153 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 20:17:52.541436 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.541408 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 20:17:52.543574 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.541483 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8tgmg\"" Apr 17 20:17:52.543574 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.541492 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 20:17:52.543574 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.541616 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 20:17:52.543574 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.541938 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 20:17:52.543574 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.542233 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 20:17:52.543574 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.542276 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 20:17:52.543574 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.542335 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 20:17:52.543574 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.543151 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 20:17:52.545869 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.544477 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 20:17:52.545869 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.544995 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 20:17:52.547767 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.547570 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 20:17:52.628356 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628326 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/18946c1e-e029-484b-89f2-05cde4450668-config-out\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628356 
ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628361 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-config\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628615 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628413 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-web-config\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628615 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628446 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628615 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628494 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628615 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628540 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628615 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628589 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628871 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628620 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628871 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628657 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628871 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628684 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sc6q\" (UniqueName: \"kubernetes.io/projected/18946c1e-e029-484b-89f2-05cde4450668-kube-api-access-2sc6q\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628871 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628723 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628871 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628749 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628871 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628776 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628871 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628807 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628871 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628827 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/18946c1e-e029-484b-89f2-05cde4450668-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628871 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628857 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/18946c1e-e029-484b-89f2-05cde4450668-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.628871 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628874 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.629376 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.628897 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.730137 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.730099 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.730341 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.730147 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.730341 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.730186 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.730341 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.730216 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.730341 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.730245 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/18946c1e-e029-484b-89f2-05cde4450668-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.730341 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.730282 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/18946c1e-e029-484b-89f2-05cde4450668-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.730341 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.730311 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.730341 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.730339 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.731000 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.730369 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/18946c1e-e029-484b-89f2-05cde4450668-config-out\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.731000 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.730407 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-config\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.731000 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.730453 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-web-config\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.731492 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.731437 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.731858 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.731515 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.731858 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.731550 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.731858 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.731606 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.731858 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.731675 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.731858 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.731710 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.731858 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.731763 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.731858 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.731790 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sc6q\" (UniqueName: \"kubernetes.io/projected/18946c1e-e029-484b-89f2-05cde4450668-kube-api-access-2sc6q\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.733180 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.732125 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/18946c1e-e029-484b-89f2-05cde4450668-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.733295 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.733267 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.734198 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.733904 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.735224 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.735194 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-web-config\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.737070 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.736770 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.737823 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.737800 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.738184 ip-10-0-132-57 kubenswrapper[2580]: I0417 
20:17:52.738153 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.740168 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.739559 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.742325 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.742297 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18946c1e-e029-484b-89f2-05cde4450668-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.749697 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.749673 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.751898 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.750826 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.751898 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.751144 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-config\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.751898 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.751579 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.751898 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.751860 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/18946c1e-e029-484b-89f2-05cde4450668-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.752574 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.752232 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/18946c1e-e029-484b-89f2-05cde4450668-config-out\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.752574 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.752393 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/18946c1e-e029-484b-89f2-05cde4450668-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.753561 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.753506 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sc6q\" (UniqueName: \"kubernetes.io/projected/18946c1e-e029-484b-89f2-05cde4450668-kube-api-access-2sc6q\") pod \"prometheus-k8s-0\" (UID: \"18946c1e-e029-484b-89f2-05cde4450668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.833169 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.833130 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/336ee31c-aa4b-4408-9cbd-46e77e017cfa-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-klvtv\" (UID: \"336ee31c-aa4b-4408-9cbd-46e77e017cfa\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv" Apr 17 20:17:52.836548 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.836499 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/336ee31c-aa4b-4408-9cbd-46e77e017cfa-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-klvtv\" (UID: \"336ee31c-aa4b-4408-9cbd-46e77e017cfa\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv" Apr 17 20:17:52.854097 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.853624 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:17:52.968110 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.968073 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv" Apr 17 20:17:52.993731 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:52.993694 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" event={"ID":"754c0844-0836-4dca-9230-21c7a04f6de9","Type":"ContainerStarted","Data":"248aa346ffe0c60e6df68f0a4b0eaa20723324831e4d2144b35a159d3512216d"} Apr 17 20:17:53.876066 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:53.873355 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76df7c9b79-w59dp"] Apr 17 20:17:54.150946 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:54.150911 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:54.151520 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:54.151496 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:17:54.153057 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:54.153025 2580 patch_prober.go:28] interesting pod/console-5d9fc4bfc5-xnjpn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.15:8443/health\": dial tcp 10.134.0.15:8443: connect: connection refused" start-of-body= Apr 17 20:17:54.153157 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:54.153085 2580 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-5d9fc4bfc5-xnjpn" podUID="132b50a8-deb0-486b-a136-71fb422c3919" containerName="console" probeResult="failure" output="Get \"https://10.134.0.15:8443/health\": dial tcp 10.134.0.15:8443: connect: connection refused" Apr 17 20:17:54.245839 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:54.245802 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:17:54.258228 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:54.258170 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv"] Apr 17 20:17:54.742147 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:54.742109 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod336ee31c_aa4b_4408_9cbd_46e77e017cfa.slice/crio-b0b633837b0c25f0f90f107afd3c4ce3b61c59c49d4c85e17dd6fa3f061f5aef WatchSource:0}: Error finding container b0b633837b0c25f0f90f107afd3c4ce3b61c59c49d4c85e17dd6fa3f061f5aef: Status 404 returned error can't find the container with id b0b633837b0c25f0f90f107afd3c4ce3b61c59c49d4c85e17dd6fa3f061f5aef Apr 17 20:17:54.743852 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:17:54.743561 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18946c1e_e029_484b_89f2_05cde4450668.slice/crio-5fb6bf068a296e357754f7cfd062afb65c830e4c327db9e0be7b824e402d6b63 WatchSource:0}: Error finding container 5fb6bf068a296e357754f7cfd062afb65c830e4c327db9e0be7b824e402d6b63: Status 404 returned error can't find the container with id 5fb6bf068a296e357754f7cfd062afb65c830e4c327db9e0be7b824e402d6b63 Apr 17 20:17:55.001669 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:55.001601 2580 generic.go:358] "Generic (PLEG): container finished" podID="18946c1e-e029-484b-89f2-05cde4450668" containerID="00e172a31b95ce4aaeb6cc6dcd639f03eb13588698711056e50657a35cb14c0c" exitCode=0 Apr 17 20:17:55.001816 
ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:55.001679 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"18946c1e-e029-484b-89f2-05cde4450668","Type":"ContainerDied","Data":"00e172a31b95ce4aaeb6cc6dcd639f03eb13588698711056e50657a35cb14c0c"} Apr 17 20:17:55.001816 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:55.001729 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"18946c1e-e029-484b-89f2-05cde4450668","Type":"ContainerStarted","Data":"5fb6bf068a296e357754f7cfd062afb65c830e4c327db9e0be7b824e402d6b63"} Apr 17 20:17:55.003537 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:55.003508 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" event={"ID":"8fb9d968-11f8-424c-833a-9133403fbf4e","Type":"ContainerStarted","Data":"a0e7d05f9e36c9ae10a7008933c301c16c7a285d0f7796e806a19665457ffbac"} Apr 17 20:17:55.005720 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:55.005695 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"79f5d464-0615-476c-8303-51771c3852b6","Type":"ContainerStarted","Data":"0613802409d994934913d4f3000739d32ab2ae7631b3165707092b709c0141ac"} Apr 17 20:17:55.006772 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:55.006751 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv" event={"ID":"336ee31c-aa4b-4408-9cbd-46e77e017cfa","Type":"ContainerStarted","Data":"b0b633837b0c25f0f90f107afd3c4ce3b61c59c49d4c85e17dd6fa3f061f5aef"} Apr 17 20:17:55.040571 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:55.039771 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" podStartSLOduration=2.158704352 podStartE2EDuration="5.03975908s" podCreationTimestamp="2026-04-17 20:17:50 +0000 UTC" firstStartedPulling="2026-04-17 20:17:51.210294611 +0000 UTC m=+138.286666598" lastFinishedPulling="2026-04-17 20:17:54.091349344 +0000 UTC m=+141.167721326" observedRunningTime="2026-04-17 20:17:55.039381241 +0000 UTC m=+142.115753244" watchObservedRunningTime="2026-04-17 20:17:55.03975908 +0000 UTC m=+142.116131081" Apr 17 20:17:56.015801 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:56.015343 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" event={"ID":"754c0844-0836-4dca-9230-21c7a04f6de9","Type":"ContainerStarted","Data":"2e0d85c744bdfa47c28297b65c22c15d3543fdb50836a02c558d15d401cb4da4"} Apr 17 20:17:56.015801 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:56.015489 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" event={"ID":"754c0844-0836-4dca-9230-21c7a04f6de9","Type":"ContainerStarted","Data":"f0559360a31bdaca1188b06901ca8a542c4f90eaae073ab2048e18dcfc10e53c"} Apr 17 20:17:56.015801 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:56.015509 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" event={"ID":"754c0844-0836-4dca-9230-21c7a04f6de9","Type":"ContainerStarted","Data":"5f4f61739c2d41ba5744591e7960991d50f173f3d5e315c796e9d0c83bc888b2"} Apr 17 20:17:56.021879 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:56.021831 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"79f5d464-0615-476c-8303-51771c3852b6","Type":"ContainerStarted","Data":"97aff3accefb746b951932890e503fc5f73dfe31f4a86afdacef049e54eda8ac"} Apr 17 20:17:56.021879 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:56.021864 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"79f5d464-0615-476c-8303-51771c3852b6","Type":"ContainerStarted","Data":"df60f7547c3dbf0341c5f7139db786f2d2abbb308984811ccbbe5a7b336dd1a7"} Apr 17 20:17:56.021879 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:56.021878 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"79f5d464-0615-476c-8303-51771c3852b6","Type":"ContainerStarted","Data":"6ecdf22c89f475324b1e7bb3c2eaf38013c4769ff5e6475e439f4511be479ce7"} Apr 17 20:17:56.022047 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:56.021889 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"79f5d464-0615-476c-8303-51771c3852b6","Type":"ContainerStarted","Data":"be81fbafecb20d5ab4978324d33cdd51abfe6fff2086a5c3287a424a72c80c8a"} Apr 17 20:17:56.035853 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:56.035251 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7dffdb7bd4-5fdhx" podStartSLOduration=1.924016282 podStartE2EDuration="5.035233047s" podCreationTimestamp="2026-04-17 20:17:51 +0000 UTC" firstStartedPulling="2026-04-17 20:17:52.029338972 +0000 UTC m=+139.105710952" lastFinishedPulling="2026-04-17 20:17:55.140555734 +0000 UTC m=+142.216927717" observedRunningTime="2026-04-17 20:17:56.034262245 +0000 UTC m=+143.110634248" watchObservedRunningTime="2026-04-17 20:17:56.035233047 +0000 UTC m=+143.111605044" Apr 17 20:17:57.640079 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:57.640044 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d9fc4bfc5-xnjpn"] Apr 17 20:17:58.033548 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:58.033469 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"79f5d464-0615-476c-8303-51771c3852b6","Type":"ContainerStarted","Data":"f2e414ca9329a1a04e9f07ee3dbad74cf96b2e1a5af954e57e1ffe4a25fd2b92"} Apr 17 20:17:58.034951 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:58.034921 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv" event={"ID":"336ee31c-aa4b-4408-9cbd-46e77e017cfa","Type":"ContainerStarted","Data":"7b0cbca523fb92601f9b78e4dd1f4680bb081c8120d6553572a10db056eb13a2"} Apr 17 20:17:58.035150 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:58.035128 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv" Apr 17 20:17:58.040855 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:58.040829 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv" Apr 17 20:17:58.060838 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:58.060797 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.38868935 podStartE2EDuration="11.060785278s" podCreationTimestamp="2026-04-17 20:17:47 +0000 UTC" firstStartedPulling="2026-04-17 20:17:48.013097705 +0000 UTC m=+135.089469685" lastFinishedPulling="2026-04-17 
20:17:57.685193624 +0000 UTC m=+144.761565613" observedRunningTime="2026-04-17 20:17:58.058913765 +0000 UTC m=+145.135285780" watchObservedRunningTime="2026-04-17 20:17:58.060785278 +0000 UTC m=+145.137157279" Apr 17 20:17:58.073398 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:17:58.072800 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-klvtv" podStartSLOduration=4.540469629 podStartE2EDuration="7.072785098s" podCreationTimestamp="2026-04-17 20:17:51 +0000 UTC" firstStartedPulling="2026-04-17 20:17:54.744943561 +0000 UTC m=+141.821315542" lastFinishedPulling="2026-04-17 20:17:57.277259023 +0000 UTC m=+144.353631011" observedRunningTime="2026-04-17 20:17:58.071585202 +0000 UTC m=+145.147957204" watchObservedRunningTime="2026-04-17 20:17:58.072785098 +0000 UTC m=+145.149157101" Apr 17 20:18:00.047029 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:00.046994 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"18946c1e-e029-484b-89f2-05cde4450668","Type":"ContainerStarted","Data":"6c367a883eab010b1e8902bcbf328077dd8bb75b9f2be693b747e21478e82d9e"} Apr 17 20:18:00.047029 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:00.047032 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"18946c1e-e029-484b-89f2-05cde4450668","Type":"ContainerStarted","Data":"0ad8fc10f3e47ec65780bc460b40faf8badd6b77eb488df5b53e33580b776fb3"} Apr 17 20:18:03.058082 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:03.058044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"18946c1e-e029-484b-89f2-05cde4450668","Type":"ContainerStarted","Data":"2aa873f88f999aa7f11a5c2020333aa0ef3132a66cd4dc0497defd5151f48f2b"} Apr 17 20:18:03.058082 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:03.058084 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"18946c1e-e029-484b-89f2-05cde4450668","Type":"ContainerStarted","Data":"0825c86811777239d1744fbd5d9831e9cef957bfab44adac8e2d8f354367ed69"} Apr 17 20:18:03.058577 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:03.058097 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"18946c1e-e029-484b-89f2-05cde4450668","Type":"ContainerStarted","Data":"1c2b226e8f7a1a809dd8f7e0e866cd00f1f8faa796fb84c8f0c7e0ea0464e6e5"} Apr 17 20:18:03.058577 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:03.058112 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"18946c1e-e029-484b-89f2-05cde4450668","Type":"ContainerStarted","Data":"38738372a706792856652ab0fc5fdf25e9b64a7d2d21896f05828843b1ea671a"} Apr 17 20:18:03.083194 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:03.083142 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.191273756 podStartE2EDuration="11.083123999s" podCreationTimestamp="2026-04-17 20:17:52 +0000 UTC" firstStartedPulling="2026-04-17 20:17:55.133623439 +0000 UTC m=+142.209995419" lastFinishedPulling="2026-04-17 20:18:02.025473682 +0000 UTC m=+149.101845662" observedRunningTime="2026-04-17 20:18:03.082001156 +0000 UTC m=+150.158373156" watchObservedRunningTime="2026-04-17 20:18:03.083123999 +0000 UTC m=+150.159496003" Apr 17 20:18:07.853884 ip-10-0-132-57 
kubenswrapper[2580]: I0417 20:18:07.853853 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:18:09.751707 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:18:09.751665 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-nztqs" podUID="29f57080-c48b-42b7-8c1a-747b7fd06533" Apr 17 20:18:09.757804 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:18:09.757776 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4x62t" podUID="e270f686-4250-41f8-a9c1-6f192df2ee57" Apr 17 20:18:10.080924 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:10.080903 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nztqs" Apr 17 20:18:11.064101 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:11.064073 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:18:11.064101 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:11.064109 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:18:14.641359 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:14.641326 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:18:14.641737 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:14.641401 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:18:14.643872 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:14.643842 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f57080-c48b-42b7-8c1a-747b7fd06533-metrics-tls\") pod \"dns-default-nztqs\" (UID: \"29f57080-c48b-42b7-8c1a-747b7fd06533\") " pod="openshift-dns/dns-default-nztqs" Apr 17 20:18:14.644118 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:14.644096 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e270f686-4250-41f8-a9c1-6f192df2ee57-cert\") pod \"ingress-canary-4x62t\" (UID: \"e270f686-4250-41f8-a9c1-6f192df2ee57\") " pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:18:14.884207 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:14.884177 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-x9tzj\"" Apr 17 20:18:14.892844 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:14.892798 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nztqs" Apr 17 20:18:15.010841 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:15.010816 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nztqs"] Apr 17 20:18:15.013098 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:18:15.013069 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29f57080_c48b_42b7_8c1a_747b7fd06533.slice/crio-3bdec3441cbdc2510ca958ab9fbfd016aaa02c8ad1bb93c05e8187ad4061a3a8 WatchSource:0}: Error finding container 3bdec3441cbdc2510ca958ab9fbfd016aaa02c8ad1bb93c05e8187ad4061a3a8: Status 404 returned error can't find the container with id 3bdec3441cbdc2510ca958ab9fbfd016aaa02c8ad1bb93c05e8187ad4061a3a8 Apr 17 20:18:15.097499 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:15.097448 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nztqs" event={"ID":"29f57080-c48b-42b7-8c1a-747b7fd06533","Type":"ContainerStarted","Data":"3bdec3441cbdc2510ca958ab9fbfd016aaa02c8ad1bb93c05e8187ad4061a3a8"} Apr 17 20:18:17.106841 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:17.106797 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nztqs" event={"ID":"29f57080-c48b-42b7-8c1a-747b7fd06533","Type":"ContainerStarted","Data":"da188fb4a4673e5f16e81bdf57976b95c439e19d5bd54a27e31c92ac28ea3440"} Apr 17 20:18:17.107190 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:17.106840 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nztqs" event={"ID":"29f57080-c48b-42b7-8c1a-747b7fd06533","Type":"ContainerStarted","Data":"9b64e1b129ef945d346bfc724c28e11892ca22b601188ffef10b68f745fe69b8"} Apr 17 20:18:17.107190 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:17.106883 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-nztqs" Apr 17 20:18:17.135583 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:17.123781 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nztqs" podStartSLOduration=129.952334914 podStartE2EDuration="2m11.123763221s" podCreationTimestamp="2026-04-17 20:16:06 +0000 UTC" firstStartedPulling="2026-04-17 20:18:15.015262104 +0000 UTC m=+162.091634093" lastFinishedPulling="2026-04-17 20:18:16.18669042 +0000 UTC m=+163.263062400" observedRunningTime="2026-04-17 20:18:17.120816244 +0000 UTC m=+164.197188246" watchObservedRunningTime="2026-04-17 20:18:17.123763221 +0000 UTC m=+164.200135224" Apr 17 20:18:18.898200 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:18.898160 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76df7c9b79-w59dp" podUID="a04f1a62-018f-41d6-a075-60c2c6d70790" containerName="console" containerID="cri-o://9af88f5ac16a593fe31417290ebbdcee276a244e2fd8c267a3a1e8f2a8edc5c4" gracePeriod=15 Apr 17 20:18:19.115607 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.115582 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76df7c9b79-w59dp_a04f1a62-018f-41d6-a075-60c2c6d70790/console/0.log" Apr 17 20:18:19.115734 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.115623 2580 generic.go:358] "Generic (PLEG): container finished" podID="a04f1a62-018f-41d6-a075-60c2c6d70790" containerID="9af88f5ac16a593fe31417290ebbdcee276a244e2fd8c267a3a1e8f2a8edc5c4" exitCode=2 Apr 17 20:18:19.115734 ip-10-0-132-57 
kubenswrapper[2580]: I0417 20:18:19.115670 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76df7c9b79-w59dp" event={"ID":"a04f1a62-018f-41d6-a075-60c2c6d70790","Type":"ContainerDied","Data":"9af88f5ac16a593fe31417290ebbdcee276a244e2fd8c267a3a1e8f2a8edc5c4"} Apr 17 20:18:19.140302 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.140285 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76df7c9b79-w59dp_a04f1a62-018f-41d6-a075-60c2c6d70790/console/0.log" Apr 17 20:18:19.140396 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.140352 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:18:19.288073 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.287992 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns6bd\" (UniqueName: \"kubernetes.io/projected/a04f1a62-018f-41d6-a075-60c2c6d70790-kube-api-access-ns6bd\") pod \"a04f1a62-018f-41d6-a075-60c2c6d70790\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " Apr 17 20:18:19.288073 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.288026 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a04f1a62-018f-41d6-a075-60c2c6d70790-console-oauth-config\") pod \"a04f1a62-018f-41d6-a075-60c2c6d70790\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " Apr 17 20:18:19.288073 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.288046 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-console-config\") pod \"a04f1a62-018f-41d6-a075-60c2c6d70790\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " Apr 17 20:18:19.288337 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.288136 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-service-ca\") pod \"a04f1a62-018f-41d6-a075-60c2c6d70790\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " Apr 17 20:18:19.288337 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.288159 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-oauth-serving-cert\") pod \"a04f1a62-018f-41d6-a075-60c2c6d70790\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " Apr 17 20:18:19.288337 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.288178 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a04f1a62-018f-41d6-a075-60c2c6d70790-console-serving-cert\") pod \"a04f1a62-018f-41d6-a075-60c2c6d70790\" (UID: \"a04f1a62-018f-41d6-a075-60c2c6d70790\") " Apr 17 20:18:19.288613 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.288572 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-console-config" (OuterVolumeSpecName: "console-config") pod "a04f1a62-018f-41d6-a075-60c2c6d70790" (UID: "a04f1a62-018f-41d6-a075-60c2c6d70790"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:18:19.288672 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.288608 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-service-ca" (OuterVolumeSpecName: "service-ca") pod "a04f1a62-018f-41d6-a075-60c2c6d70790" (UID: "a04f1a62-018f-41d6-a075-60c2c6d70790"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:18:19.290670 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.289524 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a04f1a62-018f-41d6-a075-60c2c6d70790" (UID: "a04f1a62-018f-41d6-a075-60c2c6d70790"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:18:19.295338 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.291108 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a04f1a62-018f-41d6-a075-60c2c6d70790-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a04f1a62-018f-41d6-a075-60c2c6d70790" (UID: "a04f1a62-018f-41d6-a075-60c2c6d70790"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:18:19.295338 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.292706 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a04f1a62-018f-41d6-a075-60c2c6d70790-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a04f1a62-018f-41d6-a075-60c2c6d70790" (UID: "a04f1a62-018f-41d6-a075-60c2c6d70790"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:18:19.295338 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.293282 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a04f1a62-018f-41d6-a075-60c2c6d70790-kube-api-access-ns6bd" (OuterVolumeSpecName: "kube-api-access-ns6bd") pod "a04f1a62-018f-41d6-a075-60c2c6d70790" (UID: "a04f1a62-018f-41d6-a075-60c2c6d70790"). InnerVolumeSpecName "kube-api-access-ns6bd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:18:19.389135 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.389106 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-oauth-serving-cert\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:19.389135 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.389131 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a04f1a62-018f-41d6-a075-60c2c6d70790-console-serving-cert\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:19.389297 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.389143 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ns6bd\" (UniqueName: \"kubernetes.io/projected/a04f1a62-018f-41d6-a075-60c2c6d70790-kube-api-access-ns6bd\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:19.389297 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.389156 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a04f1a62-018f-41d6-a075-60c2c6d70790-console-oauth-config\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:19.389297 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.389167 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-console-config\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:19.389297 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:19.389179 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a04f1a62-018f-41d6-a075-60c2c6d70790-service-ca\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:20.119730 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:20.119650 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76df7c9b79-w59dp_a04f1a62-018f-41d6-a075-60c2c6d70790/console/0.log" Apr 17 20:18:20.120145 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:20.119741 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76df7c9b79-w59dp" event={"ID":"a04f1a62-018f-41d6-a075-60c2c6d70790","Type":"ContainerDied","Data":"a0ecd37edef089375e33a1e3ef0be6eae59c23560134360a95cdfd2f75c5f5c3"} Apr 17 20:18:20.120145 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:20.119794 2580 scope.go:117] "RemoveContainer" containerID="9af88f5ac16a593fe31417290ebbdcee276a244e2fd8c267a3a1e8f2a8edc5c4" Apr 17 20:18:20.120145 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:20.119796 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76df7c9b79-w59dp" Apr 17 20:18:20.135316 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:20.135296 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76df7c9b79-w59dp"] Apr 17 20:18:20.138689 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:20.138669 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76df7c9b79-w59dp"] Apr 17 20:18:21.531281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:21.531246 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a04f1a62-018f-41d6-a075-60c2c6d70790" path="/var/lib/kubelet/pods/a04f1a62-018f-41d6-a075-60c2c6d70790/volumes" Apr 17 20:18:22.666262 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:22.666220 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d9fc4bfc5-xnjpn" podUID="132b50a8-deb0-486b-a136-71fb422c3919" containerName="console" containerID="cri-o://dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab" gracePeriod=15 Apr 17 20:18:22.929367 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:22.929349 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d9fc4bfc5-xnjpn_132b50a8-deb0-486b-a136-71fb422c3919/console/0.log" Apr 17 20:18:22.929509 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:22.929408 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:18:23.120282 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.120247 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-service-ca\") pod \"132b50a8-deb0-486b-a136-71fb422c3919\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " Apr 17 20:18:23.120503 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.120335 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-console-config\") pod \"132b50a8-deb0-486b-a136-71fb422c3919\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " Apr 17 20:18:23.120503 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.120370 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-trusted-ca-bundle\") pod \"132b50a8-deb0-486b-a136-71fb422c3919\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " Apr 17 20:18:23.120503 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.120426 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5f2d\" (UniqueName: \"kubernetes.io/projected/132b50a8-deb0-486b-a136-71fb422c3919-kube-api-access-w5f2d\") pod \"132b50a8-deb0-486b-a136-71fb422c3919\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " Apr 17 20:18:23.120503 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.120486 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-oauth-serving-cert\") pod \"132b50a8-deb0-486b-a136-71fb422c3919\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " Apr 17 20:18:23.120718 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.120516 2580 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/132b50a8-deb0-486b-a136-71fb422c3919-console-oauth-config\") pod \"132b50a8-deb0-486b-a136-71fb422c3919\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " Apr 17 20:18:23.120718 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.120548 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/132b50a8-deb0-486b-a136-71fb422c3919-console-serving-cert\") pod \"132b50a8-deb0-486b-a136-71fb422c3919\" (UID: \"132b50a8-deb0-486b-a136-71fb422c3919\") " Apr 17 20:18:23.120813 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.120780 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-console-config" (OuterVolumeSpecName: "console-config") pod "132b50a8-deb0-486b-a136-71fb422c3919" (UID: "132b50a8-deb0-486b-a136-71fb422c3919"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:18:23.120813 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.120784 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-service-ca" (OuterVolumeSpecName: "service-ca") pod "132b50a8-deb0-486b-a136-71fb422c3919" (UID: "132b50a8-deb0-486b-a136-71fb422c3919"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:18:23.120919 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.120802 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "132b50a8-deb0-486b-a136-71fb422c3919" (UID: "132b50a8-deb0-486b-a136-71fb422c3919"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:18:23.120919 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.120852 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "132b50a8-deb0-486b-a136-71fb422c3919" (UID: "132b50a8-deb0-486b-a136-71fb422c3919"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:18:23.122860 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.122831 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132b50a8-deb0-486b-a136-71fb422c3919-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "132b50a8-deb0-486b-a136-71fb422c3919" (UID: "132b50a8-deb0-486b-a136-71fb422c3919"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:18:23.122961 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.122856 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132b50a8-deb0-486b-a136-71fb422c3919-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "132b50a8-deb0-486b-a136-71fb422c3919" (UID: "132b50a8-deb0-486b-a136-71fb422c3919"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:18:23.122961 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.122862 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/132b50a8-deb0-486b-a136-71fb422c3919-kube-api-access-w5f2d" (OuterVolumeSpecName: "kube-api-access-w5f2d") pod "132b50a8-deb0-486b-a136-71fb422c3919" (UID: "132b50a8-deb0-486b-a136-71fb422c3919"). InnerVolumeSpecName "kube-api-access-w5f2d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:18:23.131244 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.131228 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d9fc4bfc5-xnjpn_132b50a8-deb0-486b-a136-71fb422c3919/console/0.log" Apr 17 20:18:23.131341 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.131263 2580 generic.go:358] "Generic (PLEG): container finished" podID="132b50a8-deb0-486b-a136-71fb422c3919" containerID="dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab" exitCode=2 Apr 17 20:18:23.131341 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.131291 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9fc4bfc5-xnjpn" event={"ID":"132b50a8-deb0-486b-a136-71fb422c3919","Type":"ContainerDied","Data":"dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab"} Apr 17 20:18:23.131341 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.131312 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9fc4bfc5-xnjpn" event={"ID":"132b50a8-deb0-486b-a136-71fb422c3919","Type":"ContainerDied","Data":"d4eba72ca64a3c6bcfa84a27c5dc0247117c9618b1876c9e5fac492010e83175"} Apr 17 20:18:23.131341 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.131325 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d9fc4bfc5-xnjpn" Apr 17 20:18:23.131578 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.131326 2580 scope.go:117] "RemoveContainer" containerID="dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab" Apr 17 20:18:23.139541 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.139523 2580 scope.go:117] "RemoveContainer" containerID="dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab" Apr 17 20:18:23.139792 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:18:23.139773 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab\": container with ID starting with dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab not found: ID does not exist" containerID="dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab" Apr 17 20:18:23.139846 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.139800 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab"} err="failed to get container status \"dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab\": rpc error: code = NotFound desc = could not find container \"dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab\": container with ID starting with dc895a2ec1342ad1a8ec6f7b3be89877163b733bceabf0a4859ec94ec9d4bdab not found: ID does not exist" Apr 17 20:18:23.150773 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.150749 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d9fc4bfc5-xnjpn"] Apr 17 20:18:23.153357 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.153334 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d9fc4bfc5-xnjpn"] Apr 17 20:18:23.222094 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.222028 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w5f2d\" (UniqueName: \"kubernetes.io/projected/132b50a8-deb0-486b-a136-71fb422c3919-kube-api-access-w5f2d\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:23.222094 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.222053 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-oauth-serving-cert\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:23.222094 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.222067 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/132b50a8-deb0-486b-a136-71fb422c3919-console-oauth-config\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:23.222094 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.222076 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/132b50a8-deb0-486b-a136-71fb422c3919-console-serving-cert\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:23.222094 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.222088 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-service-ca\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:23.222094 ip-10-0-132-57 
kubenswrapper[2580]: I0417 20:18:23.222097 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-console-config\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:23.222341 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.222105 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/132b50a8-deb0-486b-a136-71fb422c3919-trusted-ca-bundle\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:18:23.536186 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:23.536046 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="132b50a8-deb0-486b-a136-71fb422c3919" path="/var/lib/kubelet/pods/132b50a8-deb0-486b-a136-71fb422c3919/volumes" Apr 17 20:18:25.526364 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:25.526321 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:18:25.528770 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:25.528745 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ndtv9\"" Apr 17 20:18:25.537640 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:25.537622 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4x62t" Apr 17 20:18:25.656038 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:25.655880 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4x62t"] Apr 17 20:18:25.658628 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:18:25.658600 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode270f686_4250_41f8_a9c1_6f192df2ee57.slice/crio-da528a252f0066cad7f69359c10557a427f0c83ec146568d8e1757744cb1f116 WatchSource:0}: Error finding container da528a252f0066cad7f69359c10557a427f0c83ec146568d8e1757744cb1f116: Status 404 returned error can't find the container with id da528a252f0066cad7f69359c10557a427f0c83ec146568d8e1757744cb1f116 Apr 17 20:18:26.142155 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:26.142116 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4x62t" event={"ID":"e270f686-4250-41f8-a9c1-6f192df2ee57","Type":"ContainerStarted","Data":"da528a252f0066cad7f69359c10557a427f0c83ec146568d8e1757744cb1f116"} Apr 17 20:18:27.112153 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:27.112121 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nztqs" Apr 17 20:18:28.150243 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:28.150154 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4x62t" event={"ID":"e270f686-4250-41f8-a9c1-6f192df2ee57","Type":"ContainerStarted","Data":"8ed7f41669492c2560fd4da0ad5bc9f98578207feabefad6316e77e8737f4838"} Apr 17 20:18:28.166662 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:28.166620 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4x62t" podStartSLOduration=139.928503899 podStartE2EDuration="2m22.16660656s" podCreationTimestamp="2026-04-17 20:16:06 +0000 UTC" firstStartedPulling="2026-04-17 20:18:25.66045266 +0000 UTC m=+172.736824639" lastFinishedPulling="2026-04-17 
20:18:27.89855531 +0000 UTC m=+174.974927300" observedRunningTime="2026-04-17 20:18:28.165372447 +0000 UTC m=+175.241744448" watchObservedRunningTime="2026-04-17 20:18:28.16660656 +0000 UTC m=+175.242978561" Apr 17 20:18:31.068555 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:31.068527 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:18:31.072485 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:31.072452 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7f9447cc77-gq78m" Apr 17 20:18:39.233204 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:39.233173 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n6lwg_9d8cbec6-ac91-4373-b7ef-593404bf8a86/node-ca/0.log" Apr 17 20:18:52.854448 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:52.854185 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:18:52.871319 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:52.871294 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:18:53.245623 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:18:53.245549 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:19:36.585706 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.585675 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-pmz57"] Apr 17 20:19:36.586101 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.585978 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="132b50a8-deb0-486b-a136-71fb422c3919" containerName="console" Apr 17 20:19:36.586101 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.585990 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="132b50a8-deb0-486b-a136-71fb422c3919" containerName="console" Apr 17 20:19:36.586101 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.586012 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a04f1a62-018f-41d6-a075-60c2c6d70790" containerName="console" Apr 17 20:19:36.586101 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.586017 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04f1a62-018f-41d6-a075-60c2c6d70790" containerName="console" Apr 17 20:19:36.586101 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.586063 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="132b50a8-deb0-486b-a136-71fb422c3919" containerName="console" Apr 17 20:19:36.586101 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.586071 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a04f1a62-018f-41d6-a075-60c2c6d70790" containerName="console" Apr 17 20:19:36.589220 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.589202 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmz57" Apr 17 20:19:36.591649 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.591629 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 20:19:36.595815 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.595795 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pmz57"] Apr 17 20:19:36.595927 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.595851 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b5439bd7-7d09-4129-aaf9-d27b03010197-dbus\") pod \"global-pull-secret-syncer-pmz57\" (UID: \"b5439bd7-7d09-4129-aaf9-d27b03010197\") " pod="kube-system/global-pull-secret-syncer-pmz57" Apr 17 20:19:36.596285 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.595924 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b5439bd7-7d09-4129-aaf9-d27b03010197-kubelet-config\") pod \"global-pull-secret-syncer-pmz57\" (UID: \"b5439bd7-7d09-4129-aaf9-d27b03010197\") " pod="kube-system/global-pull-secret-syncer-pmz57" Apr 17 20:19:36.596285 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.595959 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b5439bd7-7d09-4129-aaf9-d27b03010197-original-pull-secret\") pod \"global-pull-secret-syncer-pmz57\" (UID: \"b5439bd7-7d09-4129-aaf9-d27b03010197\") " pod="kube-system/global-pull-secret-syncer-pmz57" Apr 17 20:19:36.696654 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.696628 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b5439bd7-7d09-4129-aaf9-d27b03010197-kubelet-config\") pod \"global-pull-secret-syncer-pmz57\" (UID: \"b5439bd7-7d09-4129-aaf9-d27b03010197\") " pod="kube-system/global-pull-secret-syncer-pmz57" Apr 17 20:19:36.696790 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.696670 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b5439bd7-7d09-4129-aaf9-d27b03010197-original-pull-secret\") pod \"global-pull-secret-syncer-pmz57\" (UID: \"b5439bd7-7d09-4129-aaf9-d27b03010197\") " pod="kube-system/global-pull-secret-syncer-pmz57" Apr 17 20:19:36.696790 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.696734 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b5439bd7-7d09-4129-aaf9-d27b03010197-kubelet-config\") pod \"global-pull-secret-syncer-pmz57\" (UID: \"b5439bd7-7d09-4129-aaf9-d27b03010197\") " pod="kube-system/global-pull-secret-syncer-pmz57" Apr 17 20:19:36.696790 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.696742 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b5439bd7-7d09-4129-aaf9-d27b03010197-dbus\") pod \"global-pull-secret-syncer-pmz57\" (UID: \"b5439bd7-7d09-4129-aaf9-d27b03010197\") " pod="kube-system/global-pull-secret-syncer-pmz57" Apr 17 20:19:36.696942 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.696884 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"dbus\" (UniqueName: \"kubernetes.io/host-path/b5439bd7-7d09-4129-aaf9-d27b03010197-dbus\") pod \"global-pull-secret-syncer-pmz57\" (UID: \"b5439bd7-7d09-4129-aaf9-d27b03010197\") " pod="kube-system/global-pull-secret-syncer-pmz57" Apr 17 20:19:36.699007 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.698987 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b5439bd7-7d09-4129-aaf9-d27b03010197-original-pull-secret\") pod \"global-pull-secret-syncer-pmz57\" (UID: \"b5439bd7-7d09-4129-aaf9-d27b03010197\") " pod="kube-system/global-pull-secret-syncer-pmz57" Apr 17 20:19:36.899168 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:36.899070 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmz57" Apr 17 20:19:37.014451 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:37.014422 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pmz57"] Apr 17 20:19:37.016809 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:19:37.016783 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5439bd7_7d09_4129_aaf9_d27b03010197.slice/crio-40aaec3a500d1aaa6d18e6cbe934f617f6eb591060a4525640e89ecf05c3cb0d WatchSource:0}: Error finding container 40aaec3a500d1aaa6d18e6cbe934f617f6eb591060a4525640e89ecf05c3cb0d: Status 404 returned error can't find the container with id 40aaec3a500d1aaa6d18e6cbe934f617f6eb591060a4525640e89ecf05c3cb0d Apr 17 20:19:37.356274 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:37.356239 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pmz57" event={"ID":"b5439bd7-7d09-4129-aaf9-d27b03010197","Type":"ContainerStarted","Data":"40aaec3a500d1aaa6d18e6cbe934f617f6eb591060a4525640e89ecf05c3cb0d"} Apr 17 20:19:41.372056 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:41.372014 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pmz57" event={"ID":"b5439bd7-7d09-4129-aaf9-d27b03010197","Type":"ContainerStarted","Data":"4b0cfb281cf0a8946fd35096a34b3ac95b014411fc2b3cba0f050101de9baec6"} Apr 17 20:19:41.386011 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:19:41.385957 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pmz57" podStartSLOduration=1.238173257 podStartE2EDuration="5.385944019s" podCreationTimestamp="2026-04-17 20:19:36 +0000 UTC" firstStartedPulling="2026-04-17 20:19:37.018409082 +0000 UTC m=+244.094781062" lastFinishedPulling="2026-04-17 20:19:41.166179843 +0000 UTC m=+248.242551824" observedRunningTime="2026-04-17 20:19:41.384196878 +0000 UTC m=+248.460568880" watchObservedRunningTime="2026-04-17 20:19:41.385944019 +0000 UTC m=+248.462316021" Apr 17 20:20:33.408276 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:20:33.408244 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-acl-logging/0.log" Apr 17 20:20:33.408276 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:20:33.408244 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-acl-logging/0.log" Apr 17 20:20:33.412323 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:20:33.412303 2580 kubelet.go:1628] "Image garbage collection 
succeeded" Apr 17 20:21:16.508380 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.508335 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf"] Apr 17 20:21:16.512091 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.512035 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:16.514396 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.514376 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 20:21:16.514539 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.514380 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 20:21:16.514539 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.514380 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-5wc67\"" Apr 17 20:21:16.514701 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.514684 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 20:21:16.514800 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.514785 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 20:21:16.522997 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.522975 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf"] Apr 17 20:21:16.545982 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.545950 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w4rw\" (UniqueName: \"kubernetes.io/projected/95bc4d30-7395-4a2a-8de4-7d525388ec83-kube-api-access-6w4rw\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-97hnf\" (UID: \"95bc4d30-7395-4a2a-8de4-7d525388ec83\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:16.546111 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.546009 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95bc4d30-7395-4a2a-8de4-7d525388ec83-apiservice-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-97hnf\" (UID: \"95bc4d30-7395-4a2a-8de4-7d525388ec83\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:16.546111 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.546072 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95bc4d30-7395-4a2a-8de4-7d525388ec83-webhook-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-97hnf\" (UID: \"95bc4d30-7395-4a2a-8de4-7d525388ec83\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:16.646783 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.646751 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95bc4d30-7395-4a2a-8de4-7d525388ec83-apiservice-cert\") pod 
\"opendatahub-operator-controller-manager-799c8bc7d9-97hnf\" (UID: \"95bc4d30-7395-4a2a-8de4-7d525388ec83\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:16.646783 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.646783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95bc4d30-7395-4a2a-8de4-7d525388ec83-webhook-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-97hnf\" (UID: \"95bc4d30-7395-4a2a-8de4-7d525388ec83\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:16.646978 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.646836 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6w4rw\" (UniqueName: \"kubernetes.io/projected/95bc4d30-7395-4a2a-8de4-7d525388ec83-kube-api-access-6w4rw\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-97hnf\" (UID: \"95bc4d30-7395-4a2a-8de4-7d525388ec83\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:16.649318 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.649288 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95bc4d30-7395-4a2a-8de4-7d525388ec83-apiservice-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-97hnf\" (UID: \"95bc4d30-7395-4a2a-8de4-7d525388ec83\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:16.649426 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.649303 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95bc4d30-7395-4a2a-8de4-7d525388ec83-webhook-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-97hnf\" (UID: \"95bc4d30-7395-4a2a-8de4-7d525388ec83\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:16.654518 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.654501 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w4rw\" (UniqueName: \"kubernetes.io/projected/95bc4d30-7395-4a2a-8de4-7d525388ec83-kube-api-access-6w4rw\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-97hnf\" (UID: \"95bc4d30-7395-4a2a-8de4-7d525388ec83\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:16.822765 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.822730 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:16.943315 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.943294 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf"] Apr 17 20:21:16.946261 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:21:16.946231 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95bc4d30_7395_4a2a_8de4_7d525388ec83.slice/crio-edb47d0a0c622cf72d39e45b3812fa1af4ae2f2af1369b3d1c69f06d0c5c390a WatchSource:0}: Error finding container edb47d0a0c622cf72d39e45b3812fa1af4ae2f2af1369b3d1c69f06d0c5c390a: Status 404 returned error can't find the container with id edb47d0a0c622cf72d39e45b3812fa1af4ae2f2af1369b3d1c69f06d0c5c390a Apr 17 20:21:16.947835 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:16.947818 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:21:17.646945 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:17.646896 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" event={"ID":"95bc4d30-7395-4a2a-8de4-7d525388ec83","Type":"ContainerStarted","Data":"edb47d0a0c622cf72d39e45b3812fa1af4ae2f2af1369b3d1c69f06d0c5c390a"} Apr 17 20:21:18.569823 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.569791 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk"] Apr 17 20:21:18.575310 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.575284 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.578487 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.578401 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:21:18.578487 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.578446 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 20:21:18.578768 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.578508 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 20:21:18.578768 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.578513 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 20:21:18.578768 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.578530 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-96pmd\"" Apr 17 20:21:18.578768 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.578589 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 20:21:18.583759 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.583738 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk"] Apr 17 20:21:18.666540 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.666500 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/97ad1cbe-cd5a-4054-ba83-b963654e0afb-cert\") pod \"lws-controller-manager-fd99964b4-jkhdk\" (UID: \"97ad1cbe-cd5a-4054-ba83-b963654e0afb\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.666954 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.666548 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/97ad1cbe-cd5a-4054-ba83-b963654e0afb-metrics-cert\") pod \"lws-controller-manager-fd99964b4-jkhdk\" (UID: \"97ad1cbe-cd5a-4054-ba83-b963654e0afb\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.666954 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.666583 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g8q7\" (UniqueName: \"kubernetes.io/projected/97ad1cbe-cd5a-4054-ba83-b963654e0afb-kube-api-access-8g8q7\") pod \"lws-controller-manager-fd99964b4-jkhdk\" (UID: \"97ad1cbe-cd5a-4054-ba83-b963654e0afb\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.666954 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.666705 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/97ad1cbe-cd5a-4054-ba83-b963654e0afb-manager-config\") pod \"lws-controller-manager-fd99964b4-jkhdk\" (UID: \"97ad1cbe-cd5a-4054-ba83-b963654e0afb\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.767601 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.767570 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8g8q7\" (UniqueName: \"kubernetes.io/projected/97ad1cbe-cd5a-4054-ba83-b963654e0afb-kube-api-access-8g8q7\") pod \"lws-controller-manager-fd99964b4-jkhdk\" (UID: \"97ad1cbe-cd5a-4054-ba83-b963654e0afb\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.767764 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.767626 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/97ad1cbe-cd5a-4054-ba83-b963654e0afb-manager-config\") pod \"lws-controller-manager-fd99964b4-jkhdk\" (UID: \"97ad1cbe-cd5a-4054-ba83-b963654e0afb\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.767764 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.767683 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97ad1cbe-cd5a-4054-ba83-b963654e0afb-cert\") pod \"lws-controller-manager-fd99964b4-jkhdk\" (UID: \"97ad1cbe-cd5a-4054-ba83-b963654e0afb\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.767764 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.767713 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/97ad1cbe-cd5a-4054-ba83-b963654e0afb-metrics-cert\") pod \"lws-controller-manager-fd99964b4-jkhdk\" (UID: \"97ad1cbe-cd5a-4054-ba83-b963654e0afb\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.768332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.768310 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: 
\"kubernetes.io/configmap/97ad1cbe-cd5a-4054-ba83-b963654e0afb-manager-config\") pod \"lws-controller-manager-fd99964b4-jkhdk\" (UID: \"97ad1cbe-cd5a-4054-ba83-b963654e0afb\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.770269 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.770237 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97ad1cbe-cd5a-4054-ba83-b963654e0afb-cert\") pod \"lws-controller-manager-fd99964b4-jkhdk\" (UID: \"97ad1cbe-cd5a-4054-ba83-b963654e0afb\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.770384 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.770280 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/97ad1cbe-cd5a-4054-ba83-b963654e0afb-metrics-cert\") pod \"lws-controller-manager-fd99964b4-jkhdk\" (UID: \"97ad1cbe-cd5a-4054-ba83-b963654e0afb\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.775993 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.775969 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g8q7\" (UniqueName: \"kubernetes.io/projected/97ad1cbe-cd5a-4054-ba83-b963654e0afb-kube-api-access-8g8q7\") pod \"lws-controller-manager-fd99964b4-jkhdk\" (UID: \"97ad1cbe-cd5a-4054-ba83-b963654e0afb\") " pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:18.888200 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:18.888126 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:19.471767 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:19.471745 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk"] Apr 17 20:21:19.473966 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:21:19.473938 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97ad1cbe_cd5a_4054_ba83_b963654e0afb.slice/crio-31fd607c596ffb017f5e31c0a9890dcc74c9d24129f0c8dfed7140355252bac4 WatchSource:0}: Error finding container 31fd607c596ffb017f5e31c0a9890dcc74c9d24129f0c8dfed7140355252bac4: Status 404 returned error can't find the container with id 31fd607c596ffb017f5e31c0a9890dcc74c9d24129f0c8dfed7140355252bac4 Apr 17 20:21:19.654938 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:19.654846 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" event={"ID":"97ad1cbe-cd5a-4054-ba83-b963654e0afb","Type":"ContainerStarted","Data":"31fd607c596ffb017f5e31c0a9890dcc74c9d24129f0c8dfed7140355252bac4"} Apr 17 20:21:19.658500 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:19.658448 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" event={"ID":"95bc4d30-7395-4a2a-8de4-7d525388ec83","Type":"ContainerStarted","Data":"9c3c6dc2434133b59d4de469fab1789ba3d0b38e40abfe85f256545223011a5e"} Apr 17 20:21:19.658618 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:19.658611 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:19.677727 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:19.677670 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" podStartSLOduration=1.227155976 podStartE2EDuration="3.677656634s" podCreationTimestamp="2026-04-17 20:21:16 +0000 UTC" firstStartedPulling="2026-04-17 20:21:16.947936441 +0000 UTC m=+344.024308420" lastFinishedPulling="2026-04-17 20:21:19.398437084 +0000 UTC m=+346.474809078" observedRunningTime="2026-04-17 20:21:19.676440614 +0000 UTC m=+346.752812616" watchObservedRunningTime="2026-04-17 20:21:19.677656634 +0000 UTC m=+346.754028635" Apr 17 20:21:22.669917 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:22.669879 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" event={"ID":"97ad1cbe-cd5a-4054-ba83-b963654e0afb","Type":"ContainerStarted","Data":"19a68956f4990022b4bff713930e3217c34ae92e212e2a76574516dcecf14100"} Apr 17 20:21:22.670273 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:22.670004 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:22.686720 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:22.686676 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" podStartSLOduration=2.299795229 podStartE2EDuration="4.686663261s" podCreationTimestamp="2026-04-17 20:21:18 +0000 UTC" firstStartedPulling="2026-04-17 20:21:19.475703337 +0000 UTC m=+346.552075318" lastFinishedPulling="2026-04-17 20:21:21.862571365 +0000 UTC m=+348.938943350" observedRunningTime="2026-04-17 20:21:22.685940022 +0000 UTC m=+349.762312025" watchObservedRunningTime="2026-04-17 20:21:22.686663261 +0000 UTC m=+349.763035263" Apr 17 20:21:30.665391 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:30.665359 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-97hnf" Apr 17 20:21:33.674908 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:33.674871 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-fd99964b4-jkhdk" Apr 17 20:21:34.751402 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.751373 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq"] Apr 17 20:21:34.755127 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.755108 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" Apr 17 20:21:34.757949 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.757930 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-7h6rh\"" Apr 17 20:21:34.758065 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.757947 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 20:21:34.758065 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.757947 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 20:21:34.761286 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.761250 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq"] Apr 17 20:21:34.798567 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.798539 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d95x\" (UniqueName: \"kubernetes.io/projected/3db732ac-f37e-4b9d-a213-cc5a30a150bf-kube-api-access-9d95x\") pod \"kube-auth-proxy-686dfbc75c-zz9tq\" (UID: \"3db732ac-f37e-4b9d-a213-cc5a30a150bf\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" Apr 17 20:21:34.798678 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.798581 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3db732ac-f37e-4b9d-a213-cc5a30a150bf-tls-certs\") pod \"kube-auth-proxy-686dfbc75c-zz9tq\" (UID: \"3db732ac-f37e-4b9d-a213-cc5a30a150bf\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" Apr 17 20:21:34.798678 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.798602 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3db732ac-f37e-4b9d-a213-cc5a30a150bf-tmp\") pod \"kube-auth-proxy-686dfbc75c-zz9tq\" (UID: \"3db732ac-f37e-4b9d-a213-cc5a30a150bf\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" Apr 17 20:21:34.899700 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.899671 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9d95x\" (UniqueName: \"kubernetes.io/projected/3db732ac-f37e-4b9d-a213-cc5a30a150bf-kube-api-access-9d95x\") pod \"kube-auth-proxy-686dfbc75c-zz9tq\" (UID: \"3db732ac-f37e-4b9d-a213-cc5a30a150bf\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" Apr 17 20:21:34.899799 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.899729 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3db732ac-f37e-4b9d-a213-cc5a30a150bf-tls-certs\") pod \"kube-auth-proxy-686dfbc75c-zz9tq\" (UID: \"3db732ac-f37e-4b9d-a213-cc5a30a150bf\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" Apr 17 20:21:34.899799 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.899755 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3db732ac-f37e-4b9d-a213-cc5a30a150bf-tmp\") pod \"kube-auth-proxy-686dfbc75c-zz9tq\" (UID: \"3db732ac-f37e-4b9d-a213-cc5a30a150bf\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" Apr 17 20:21:34.902050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.902031 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3db732ac-f37e-4b9d-a213-cc5a30a150bf-tmp\") pod \"kube-auth-proxy-686dfbc75c-zz9tq\" (UID: \"3db732ac-f37e-4b9d-a213-cc5a30a150bf\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" Apr 17 20:21:34.902310 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.902293 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3db732ac-f37e-4b9d-a213-cc5a30a150bf-tls-certs\") pod \"kube-auth-proxy-686dfbc75c-zz9tq\" (UID: \"3db732ac-f37e-4b9d-a213-cc5a30a150bf\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" Apr 17 20:21:34.906280 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:34.906262 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d95x\" (UniqueName: \"kubernetes.io/projected/3db732ac-f37e-4b9d-a213-cc5a30a150bf-kube-api-access-9d95x\") pod \"kube-auth-proxy-686dfbc75c-zz9tq\" (UID: \"3db732ac-f37e-4b9d-a213-cc5a30a150bf\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" Apr 17 20:21:35.066080 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:35.066009 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" Apr 17 20:21:35.183738 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:35.183717 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq"] Apr 17 20:21:35.186294 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:21:35.186264 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db732ac_f37e_4b9d_a213_cc5a30a150bf.slice/crio-78e51dd658e082f9f72e56af3689dbdae3557522694374cb3207901732fd315c WatchSource:0}: Error finding container 78e51dd658e082f9f72e56af3689dbdae3557522694374cb3207901732fd315c: Status 404 returned error can't find the container with id 78e51dd658e082f9f72e56af3689dbdae3557522694374cb3207901732fd315c Apr 17 20:21:35.711713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:35.711682 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" event={"ID":"3db732ac-f37e-4b9d-a213-cc5a30a150bf","Type":"ContainerStarted","Data":"78e51dd658e082f9f72e56af3689dbdae3557522694374cb3207901732fd315c"} Apr 17 20:21:39.726179 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:39.726135 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" event={"ID":"3db732ac-f37e-4b9d-a213-cc5a30a150bf","Type":"ContainerStarted","Data":"d4c794acb6c01df7763377335a60259ae0ff7d25f7e86bac97307eaa34dd84c1"} Apr 17 20:21:39.742703 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:21:39.742656 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zz9tq" podStartSLOduration=2.071507144 podStartE2EDuration="5.742643309s" podCreationTimestamp="2026-04-17 20:21:34 +0000 UTC" firstStartedPulling="2026-04-17 20:21:35.188158299 +0000 UTC m=+362.264530283" lastFinishedPulling="2026-04-17 20:21:38.859294465 +0000 UTC m=+365.935666448" observedRunningTime="2026-04-17 20:21:39.740877973 +0000 UTC m=+366.817249990" watchObservedRunningTime="2026-04-17 20:21:39.742643309 +0000 UTC m=+366.819015372" Apr 17 20:23:19.258224 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.258192 2580 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58"] Apr 17 20:23:19.261598 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.261583 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" Apr 17 20:23:19.266968 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.266930 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 20:23:19.267121 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.267032 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 20:23:19.267121 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.267032 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 20:23:19.267332 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.267316 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 20:23:19.268034 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.267827 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-dhzz5\"" Apr 17 20:23:19.272520 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.272500 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58"] Apr 17 20:23:19.356041 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.356015 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8af0c5f7-0993-405e-a894-94b97c9218f2-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-7kz58\" (UID: \"8af0c5f7-0993-405e-a894-94b97c9218f2\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" Apr 17 20:23:19.356209 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.356068 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8af0c5f7-0993-405e-a894-94b97c9218f2-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-7kz58\" (UID: \"8af0c5f7-0993-405e-a894-94b97c9218f2\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" Apr 17 20:23:19.356209 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.356125 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z87ng\" (UniqueName: \"kubernetes.io/projected/8af0c5f7-0993-405e-a894-94b97c9218f2-kube-api-access-z87ng\") pod \"kuadrant-console-plugin-6cb54b5c86-7kz58\" (UID: \"8af0c5f7-0993-405e-a894-94b97c9218f2\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" Apr 17 20:23:19.457355 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.457322 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8af0c5f7-0993-405e-a894-94b97c9218f2-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-7kz58\" (UID: \"8af0c5f7-0993-405e-a894-94b97c9218f2\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" Apr 17 20:23:19.457549 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.457394 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8af0c5f7-0993-405e-a894-94b97c9218f2-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-7kz58\" (UID: \"8af0c5f7-0993-405e-a894-94b97c9218f2\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" Apr 17 20:23:19.457549 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.457427 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z87ng\" (UniqueName: \"kubernetes.io/projected/8af0c5f7-0993-405e-a894-94b97c9218f2-kube-api-access-z87ng\") pod \"kuadrant-console-plugin-6cb54b5c86-7kz58\" (UID: \"8af0c5f7-0993-405e-a894-94b97c9218f2\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" Apr 17 20:23:19.457990 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.457970 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8af0c5f7-0993-405e-a894-94b97c9218f2-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-7kz58\" (UID: \"8af0c5f7-0993-405e-a894-94b97c9218f2\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" Apr 17 20:23:19.459768 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.459751 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8af0c5f7-0993-405e-a894-94b97c9218f2-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-7kz58\" (UID: \"8af0c5f7-0993-405e-a894-94b97c9218f2\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" Apr 17 20:23:19.464652 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.464630 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z87ng\" (UniqueName: \"kubernetes.io/projected/8af0c5f7-0993-405e-a894-94b97c9218f2-kube-api-access-z87ng\") pod \"kuadrant-console-plugin-6cb54b5c86-7kz58\" (UID: \"8af0c5f7-0993-405e-a894-94b97c9218f2\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" Apr 17 20:23:19.581698 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.581672 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" Apr 17 20:23:19.702936 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:19.702910 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58"] Apr 17 20:23:19.705587 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:23:19.705551 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8af0c5f7_0993_405e_a894_94b97c9218f2.slice/crio-e33fe7663848b501e1f1e28933cc23824316155eb142cb068897995e65ff0a5b WatchSource:0}: Error finding container e33fe7663848b501e1f1e28933cc23824316155eb142cb068897995e65ff0a5b: Status 404 returned error can't find the container with id e33fe7663848b501e1f1e28933cc23824316155eb142cb068897995e65ff0a5b Apr 17 20:23:20.049009 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:20.048931 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" event={"ID":"8af0c5f7-0993-405e-a894-94b97c9218f2","Type":"ContainerStarted","Data":"e33fe7663848b501e1f1e28933cc23824316155eb142cb068897995e65ff0a5b"} Apr 17 20:23:44.150472 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:44.150423 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" event={"ID":"8af0c5f7-0993-405e-a894-94b97c9218f2","Type":"ContainerStarted","Data":"6b7c375f1c74e29c211c653a1b165b24900e562c4cbdd44a44e1d83c15c01c6b"} Apr 17 20:23:44.165839 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:23:44.165795 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-7kz58" podStartSLOduration=1.766568512 podStartE2EDuration="25.165781766s" podCreationTimestamp="2026-04-17 20:23:19 +0000 UTC" firstStartedPulling="2026-04-17 20:23:19.706934371 +0000 UTC m=+466.783306354" lastFinishedPulling="2026-04-17 20:23:43.106147626 +0000 UTC m=+490.182519608" observedRunningTime="2026-04-17 20:23:44.164246403 +0000 UTC m=+491.240618402" watchObservedRunningTime="2026-04-17 20:23:44.165781766 +0000 UTC m=+491.242153768" Apr 17 20:24:06.169575 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.169530 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-vxkn7"] Apr 17 20:24:06.247604 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.247570 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-vxkn7"] Apr 17 20:24:06.247604 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.247599 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-vxkn7"] Apr 17 20:24:06.247783 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.247701 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" Apr 17 20:24:06.249607 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.249588 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 20:24:06.265893 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.265860 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e63872d6-333e-4199-9281-a33b9fc40c93-config-file\") pod \"limitador-limitador-78c99df468-vxkn7\" (UID: \"e63872d6-333e-4199-9281-a33b9fc40c93\") " pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" Apr 17 20:24:06.266100 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.266082 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znxkz\" (UniqueName: \"kubernetes.io/projected/e63872d6-333e-4199-9281-a33b9fc40c93-kube-api-access-znxkz\") pod \"limitador-limitador-78c99df468-vxkn7\" (UID: \"e63872d6-333e-4199-9281-a33b9fc40c93\") " pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" Apr 17 20:24:06.367130 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.367098 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e63872d6-333e-4199-9281-a33b9fc40c93-config-file\") pod \"limitador-limitador-78c99df468-vxkn7\" (UID: \"e63872d6-333e-4199-9281-a33b9fc40c93\") " pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" Apr 17 20:24:06.367259 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.367143 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znxkz\" (UniqueName: \"kubernetes.io/projected/e63872d6-333e-4199-9281-a33b9fc40c93-kube-api-access-znxkz\") pod \"limitador-limitador-78c99df468-vxkn7\" (UID: \"e63872d6-333e-4199-9281-a33b9fc40c93\") " pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" Apr 17 20:24:06.367799 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.367779 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e63872d6-333e-4199-9281-a33b9fc40c93-config-file\") pod \"limitador-limitador-78c99df468-vxkn7\" (UID: \"e63872d6-333e-4199-9281-a33b9fc40c93\") " pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" Apr 17 20:24:06.374246 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.374219 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znxkz\" (UniqueName: \"kubernetes.io/projected/e63872d6-333e-4199-9281-a33b9fc40c93-kube-api-access-znxkz\") pod \"limitador-limitador-78c99df468-vxkn7\" (UID: \"e63872d6-333e-4199-9281-a33b9fc40c93\") " pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" Apr 17 20:24:06.558217 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.558139 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" Apr 17 20:24:06.676793 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:06.676758 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-vxkn7"] Apr 17 20:24:06.679789 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:24:06.679762 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode63872d6_333e_4199_9281_a33b9fc40c93.slice/crio-73265f431e11a0e61442d998f8bfd342eb0b42ed649ea0e8810685df2d9a285c WatchSource:0}: Error finding container 73265f431e11a0e61442d998f8bfd342eb0b42ed649ea0e8810685df2d9a285c: Status 404 returned error can't find the container with id 73265f431e11a0e61442d998f8bfd342eb0b42ed649ea0e8810685df2d9a285c Apr 17 20:24:07.226639 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:07.226580 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" event={"ID":"e63872d6-333e-4199-9281-a33b9fc40c93","Type":"ContainerStarted","Data":"73265f431e11a0e61442d998f8bfd342eb0b42ed649ea0e8810685df2d9a285c"} Apr 17 20:24:10.238157 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:10.238115 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" event={"ID":"e63872d6-333e-4199-9281-a33b9fc40c93","Type":"ContainerStarted","Data":"c28d64d880af4c5757e7577fb5d2d93d2aca7206c7f94a87c9ca78f32a8a05ee"} Apr 17 20:24:10.238548 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:10.238231 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" Apr 17 20:24:10.253584 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:10.253536 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" podStartSLOduration=1.7340506759999998 podStartE2EDuration="4.253521587s" podCreationTimestamp="2026-04-17 20:24:06 +0000 UTC" firstStartedPulling="2026-04-17 20:24:06.68161387 +0000 UTC m=+513.757985851" lastFinishedPulling="2026-04-17 20:24:09.201084782 +0000 UTC m=+516.277456762" observedRunningTime="2026-04-17 20:24:10.251533315 +0000 UTC m=+517.327905317" watchObservedRunningTime="2026-04-17 20:24:10.253521587 +0000 UTC m=+517.329893589" Apr 17 20:24:21.243441 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:21.243348 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-vxkn7" Apr 17 20:24:39.745288 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:39.745251 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6cb96f9bfb-fcgcf"] Apr 17 20:24:39.759877 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:39.759843 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6cb96f9bfb-fcgcf"] Apr 17 20:24:39.759996 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:39.759905 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" Apr 17 20:24:39.762096 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:39.762078 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-5m5m7\"" Apr 17 20:24:39.847828 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:39.847797 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6ct7\" (UniqueName: \"kubernetes.io/projected/3be0df59-e670-499a-a846-46d1c34e3bd7-kube-api-access-x6ct7\") pod \"authorino-6cb96f9bfb-fcgcf\" (UID: \"3be0df59-e670-499a-a846-46d1c34e3bd7\") " pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" Apr 17 20:24:39.949224 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:39.949192 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6ct7\" (UniqueName: \"kubernetes.io/projected/3be0df59-e670-499a-a846-46d1c34e3bd7-kube-api-access-x6ct7\") pod \"authorino-6cb96f9bfb-fcgcf\" (UID: \"3be0df59-e670-499a-a846-46d1c34e3bd7\") " pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" Apr 17 20:24:39.956529 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:39.956500 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6ct7\" (UniqueName: \"kubernetes.io/projected/3be0df59-e670-499a-a846-46d1c34e3bd7-kube-api-access-x6ct7\") pod \"authorino-6cb96f9bfb-fcgcf\" (UID: \"3be0df59-e670-499a-a846-46d1c34e3bd7\") " pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" Apr 17 20:24:39.974615 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:39.974589 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6cb96f9bfb-fcgcf"] Apr 17 20:24:39.974808 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:39.974797 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" Apr 17 20:24:40.090253 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:40.090227 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6cb96f9bfb-fcgcf"] Apr 17 20:24:40.092876 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:24:40.092848 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3be0df59_e670_499a_a846_46d1c34e3bd7.slice/crio-5c6ff9e93128d3d518d813da65492a3cd087af29d5bbef9bbad0bd4ef9871a65 WatchSource:0}: Error finding container 5c6ff9e93128d3d518d813da65492a3cd087af29d5bbef9bbad0bd4ef9871a65: Status 404 returned error can't find the container with id 5c6ff9e93128d3d518d813da65492a3cd087af29d5bbef9bbad0bd4ef9871a65 Apr 17 20:24:40.337691 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:40.337660 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" event={"ID":"3be0df59-e670-499a-a846-46d1c34e3bd7","Type":"ContainerStarted","Data":"5c6ff9e93128d3d518d813da65492a3cd087af29d5bbef9bbad0bd4ef9871a65"} Apr 17 20:24:43.350698 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:43.350662 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" event={"ID":"3be0df59-e670-499a-a846-46d1c34e3bd7","Type":"ContainerStarted","Data":"c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851"} Apr 17 20:24:43.351177 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:43.350703 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" podUID="3be0df59-e670-499a-a846-46d1c34e3bd7" containerName="authorino" containerID="cri-o://c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851" gracePeriod=30 Apr 17 20:24:43.364288 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:43.364229 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" podStartSLOduration=1.2949837180000001 podStartE2EDuration="4.364211139s" podCreationTimestamp="2026-04-17 20:24:39 +0000 UTC" firstStartedPulling="2026-04-17 20:24:40.094081882 +0000 UTC m=+547.170453866" lastFinishedPulling="2026-04-17 20:24:43.163309306 +0000 UTC m=+550.239681287" observedRunningTime="2026-04-17 20:24:43.363548092 +0000 UTC m=+550.439920097" watchObservedRunningTime="2026-04-17 20:24:43.364211139 +0000 UTC m=+550.440583143" Apr 17 20:24:43.596115 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:43.596088 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" Apr 17 20:24:43.683164 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:43.683134 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6ct7\" (UniqueName: \"kubernetes.io/projected/3be0df59-e670-499a-a846-46d1c34e3bd7-kube-api-access-x6ct7\") pod \"3be0df59-e670-499a-a846-46d1c34e3bd7\" (UID: \"3be0df59-e670-499a-a846-46d1c34e3bd7\") " Apr 17 20:24:43.685427 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:43.685398 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be0df59-e670-499a-a846-46d1c34e3bd7-kube-api-access-x6ct7" (OuterVolumeSpecName: "kube-api-access-x6ct7") pod "3be0df59-e670-499a-a846-46d1c34e3bd7" (UID: "3be0df59-e670-499a-a846-46d1c34e3bd7"). InnerVolumeSpecName "kube-api-access-x6ct7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:24:43.784105 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:43.784029 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6ct7\" (UniqueName: \"kubernetes.io/projected/3be0df59-e670-499a-a846-46d1c34e3bd7-kube-api-access-x6ct7\") on node \"ip-10-0-132-57.ec2.internal\" DevicePath \"\"" Apr 17 20:24:44.354635 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:44.354601 2580 generic.go:358] "Generic (PLEG): container finished" podID="3be0df59-e670-499a-a846-46d1c34e3bd7" containerID="c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851" exitCode=0 Apr 17 20:24:44.355036 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:44.354645 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" Apr 17 20:24:44.355036 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:44.354692 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" event={"ID":"3be0df59-e670-499a-a846-46d1c34e3bd7","Type":"ContainerDied","Data":"c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851"} Apr 17 20:24:44.355036 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:44.354730 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6cb96f9bfb-fcgcf" event={"ID":"3be0df59-e670-499a-a846-46d1c34e3bd7","Type":"ContainerDied","Data":"5c6ff9e93128d3d518d813da65492a3cd087af29d5bbef9bbad0bd4ef9871a65"} Apr 17 20:24:44.355036 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:44.354749 2580 scope.go:117] "RemoveContainer" containerID="c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851" Apr 17 20:24:44.364009 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:44.363994 2580 scope.go:117] "RemoveContainer" containerID="c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851" Apr 17 20:24:44.364244 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:24:44.364223 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851\": container with ID starting with c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851 not found: ID does not exist" containerID="c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851" Apr 17 20:24:44.364306 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:44.364257 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851"} err="failed to get container status \"c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851\": rpc error: code = NotFound desc = could not find container \"c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851\": container with ID starting with c45a83a315a82a5ff203c3e7a5ce326c3f2e56008f3fae4c6b9bce7dd5e4b851 not found: ID does not exist" Apr 17 20:24:44.374026 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:44.374004 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6cb96f9bfb-fcgcf"] Apr 17 20:24:44.379492 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:44.379453 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-6cb96f9bfb-fcgcf"] Apr 17 20:24:45.531143 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:24:45.531109 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3be0df59-e670-499a-a846-46d1c34e3bd7" path="/var/lib/kubelet/pods/3be0df59-e670-499a-a846-46d1c34e3bd7/volumes" Apr 17 20:25:18.826022 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:18.825986 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-vxkn7"] Apr 17 20:25:33.432421 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:33.432384 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-acl-logging/0.log" Apr 17 20:25:33.433134 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:33.433114 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-acl-logging/0.log" Apr 17 20:25:44.628184 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:44.628149 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-vxkn7"] Apr 17 20:25:46.131852 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.131820 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-vxkn7"] Apr 17 20:25:46.716736 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.716703 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v"] Apr 17 20:25:46.717070 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.717058 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3be0df59-e670-499a-a846-46d1c34e3bd7" containerName="authorino" Apr 17 20:25:46.717114 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.717073 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be0df59-e670-499a-a846-46d1c34e3bd7" containerName="authorino" Apr 17 20:25:46.717168 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.717158 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="3be0df59-e670-499a-a846-46d1c34e3bd7" containerName="authorino" Apr 17 20:25:46.721589 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.721568 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.723751 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.723732 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 20:25:46.723890 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.723760 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 17 20:25:46.723890 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.723780 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 20:25:46.724619 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.724605 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-vsxc9\"" Apr 17 20:25:46.730589 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.730569 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v"] Apr 17 20:25:46.734555 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.734535 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b6aa065-f505-474b-a1ad-202164b57fc5-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.734617 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.734580 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2b6aa065-f505-474b-a1ad-202164b57fc5-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.734677 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.734660 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6aa065-f505-474b-a1ad-202164b57fc5-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.734713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.734687 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2b6aa065-f505-474b-a1ad-202164b57fc5-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.734713 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.734705 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j24n6\" (UniqueName: \"kubernetes.io/projected/2b6aa065-f505-474b-a1ad-202164b57fc5-kube-api-access-j24n6\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.734779 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.734766 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b6aa065-f505-474b-a1ad-202164b57fc5-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.835432 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.835378 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b6aa065-f505-474b-a1ad-202164b57fc5-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.835649 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.835498 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b6aa065-f505-474b-a1ad-202164b57fc5-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.835649 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.835524 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2b6aa065-f505-474b-a1ad-202164b57fc5-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.835649 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.835554 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6aa065-f505-474b-a1ad-202164b57fc5-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.835649 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.835574 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2b6aa065-f505-474b-a1ad-202164b57fc5-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.835649 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.835592 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j24n6\" (UniqueName: \"kubernetes.io/projected/2b6aa065-f505-474b-a1ad-202164b57fc5-kube-api-access-j24n6\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.835875 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.835847 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b6aa065-f505-474b-a1ad-202164b57fc5-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.835932 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.835917 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/2b6aa065-f505-474b-a1ad-202164b57fc5-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.836032 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.836010 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2b6aa065-f505-474b-a1ad-202164b57fc5-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.837962 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.837935 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2b6aa065-f505-474b-a1ad-202164b57fc5-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.838112 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.838094 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6aa065-f505-474b-a1ad-202164b57fc5-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:46.842959 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:46.842936 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j24n6\" (UniqueName: \"kubernetes.io/projected/2b6aa065-f505-474b-a1ad-202164b57fc5-kube-api-access-j24n6\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v\" (UID: \"2b6aa065-f505-474b-a1ad-202164b57fc5\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:47.033279 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:47.033202 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:25:47.161677 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:47.161648 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v"] Apr 17 20:25:47.165331 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:25:47.165305 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b6aa065_f505_474b_a1ad_202164b57fc5.slice/crio-04418183134a5b54e51be8fb9c4b74890b74bf7dac128954f10ca3e7307497f7 WatchSource:0}: Error finding container 04418183134a5b54e51be8fb9c4b74890b74bf7dac128954f10ca3e7307497f7: Status 404 returned error can't find the container with id 04418183134a5b54e51be8fb9c4b74890b74bf7dac128954f10ca3e7307497f7 Apr 17 20:25:47.572764 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:47.572719 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" event={"ID":"2b6aa065-f505-474b-a1ad-202164b57fc5","Type":"ContainerStarted","Data":"04418183134a5b54e51be8fb9c4b74890b74bf7dac128954f10ca3e7307497f7"} Apr 17 20:25:49.131751 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:49.131712 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-vxkn7"] Apr 17 20:25:52.591998 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:52.591964 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" event={"ID":"2b6aa065-f505-474b-a1ad-202164b57fc5","Type":"ContainerStarted","Data":"fdda8a98e387e23b9a5b628bda3039d6b793a2154c2a92808f78ea5f30972470"} Apr 17 20:25:53.042725 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:53.042635 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-vxkn7"] Apr 17 20:25:57.610397 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:57.610363 2580 generic.go:358] "Generic (PLEG): container finished" podID="2b6aa065-f505-474b-a1ad-202164b57fc5" containerID="fdda8a98e387e23b9a5b628bda3039d6b793a2154c2a92808f78ea5f30972470" exitCode=0 Apr 17 20:25:57.610397 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:57.610407 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" event={"ID":"2b6aa065-f505-474b-a1ad-202164b57fc5","Type":"ContainerDied","Data":"fdda8a98e387e23b9a5b628bda3039d6b793a2154c2a92808f78ea5f30972470"} Apr 17 20:25:59.136333 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:59.136296 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-vxkn7"] Apr 17 20:25:59.618282 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:59.618245 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/0.log" Apr 17 20:25:59.618604 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:59.618580 2580 generic.go:358] "Generic (PLEG): container finished" podID="2b6aa065-f505-474b-a1ad-202164b57fc5" containerID="31a4eff92c2d1447ee36af64a9e7d341db52efed83ca16b9f901e31d24033e50" exitCode=2 Apr 17 20:25:59.618667 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:59.618639 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" 
event={"ID":"2b6aa065-f505-474b-a1ad-202164b57fc5","Type":"ContainerDied","Data":"31a4eff92c2d1447ee36af64a9e7d341db52efed83ca16b9f901e31d24033e50"} Apr 17 20:25:59.619008 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:25:59.618993 2580 scope.go:117] "RemoveContainer" containerID="31a4eff92c2d1447ee36af64a9e7d341db52efed83ca16b9f901e31d24033e50" Apr 17 20:26:00.623393 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:00.623364 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/1.log" Apr 17 20:26:00.623879 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:00.623805 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/0.log" Apr 17 20:26:00.624160 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:00.624137 2580 generic.go:358] "Generic (PLEG): container finished" podID="2b6aa065-f505-474b-a1ad-202164b57fc5" containerID="cd8dd6c10b274d9904447ee8420757e391421044f252e260a60632db1bc9d7b7" exitCode=2 Apr 17 20:26:00.624227 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:00.624209 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" event={"ID":"2b6aa065-f505-474b-a1ad-202164b57fc5","Type":"ContainerDied","Data":"cd8dd6c10b274d9904447ee8420757e391421044f252e260a60632db1bc9d7b7"} Apr 17 20:26:00.624260 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:00.624251 2580 scope.go:117] "RemoveContainer" containerID="31a4eff92c2d1447ee36af64a9e7d341db52efed83ca16b9f901e31d24033e50" Apr 17 20:26:00.624710 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:00.624695 2580 scope.go:117] "RemoveContainer" containerID="cd8dd6c10b274d9904447ee8420757e391421044f252e260a60632db1bc9d7b7" Apr 17 20:26:00.624909 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:26:00.624888 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:26:01.629050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:01.629022 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/1.log" Apr 17 20:26:07.033926 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:07.033894 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:26:07.033926 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:07.033934 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:26:07.034484 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:07.034447 2580 scope.go:117] "RemoveContainer" containerID="cd8dd6c10b274d9904447ee8420757e391421044f252e260a60632db1bc9d7b7" Apr 17 20:26:07.034716 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:26:07.034696 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" 
pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:26:19.030431 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.030391 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf"] Apr 17 20:26:19.051705 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.051676 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf"] Apr 17 20:26:19.051845 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.051798 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.053866 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.053846 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 17 20:26:19.124141 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.124111 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wrfv\" (UniqueName: \"kubernetes.io/projected/2002ee70-8b98-4354-8670-18b07579e406-kube-api-access-2wrfv\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.124277 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.124156 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2002ee70-8b98-4354-8670-18b07579e406-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.124277 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.124234 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2002ee70-8b98-4354-8670-18b07579e406-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.124352 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.124282 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2002ee70-8b98-4354-8670-18b07579e406-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.124386 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.124354 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2002ee70-8b98-4354-8670-18b07579e406-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.124427 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.124382 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2002ee70-8b98-4354-8670-18b07579e406-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: 
\"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.225019 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.224984 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2002ee70-8b98-4354-8670-18b07579e406-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.225166 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.225031 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2002ee70-8b98-4354-8670-18b07579e406-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.225166 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.225153 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2002ee70-8b98-4354-8670-18b07579e406-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.225243 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.225220 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wrfv\" (UniqueName: \"kubernetes.io/projected/2002ee70-8b98-4354-8670-18b07579e406-kube-api-access-2wrfv\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.225285 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.225268 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2002ee70-8b98-4354-8670-18b07579e406-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.225331 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.225310 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2002ee70-8b98-4354-8670-18b07579e406-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.225385 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.225325 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2002ee70-8b98-4354-8670-18b07579e406-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.225628 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.225608 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2002ee70-8b98-4354-8670-18b07579e406-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " 
pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.225722 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.225663 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2002ee70-8b98-4354-8670-18b07579e406-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.227291 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.227268 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2002ee70-8b98-4354-8670-18b07579e406-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.227715 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.227698 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2002ee70-8b98-4354-8670-18b07579e406-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.233308 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.233286 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wrfv\" (UniqueName: \"kubernetes.io/projected/2002ee70-8b98-4354-8670-18b07579e406-kube-api-access-2wrfv\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-ntccf\" (UID: \"2002ee70-8b98-4354-8670-18b07579e406\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.362281 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.362248 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:19.486747 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.486724 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf"] Apr 17 20:26:19.489170 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:26:19.489133 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2002ee70_8b98_4354_8670_18b07579e406.slice/crio-b802225f5950c832c51ebbb73fac8ed6002babe657dcb35217a56538163ce451 WatchSource:0}: Error finding container b802225f5950c832c51ebbb73fac8ed6002babe657dcb35217a56538163ce451: Status 404 returned error can't find the container with id b802225f5950c832c51ebbb73fac8ed6002babe657dcb35217a56538163ce451 Apr 17 20:26:19.491047 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.491028 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:26:19.526763 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.526746 2580 scope.go:117] "RemoveContainer" containerID="cd8dd6c10b274d9904447ee8420757e391421044f252e260a60632db1bc9d7b7" Apr 17 20:26:19.690531 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.690419 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" event={"ID":"2002ee70-8b98-4354-8670-18b07579e406","Type":"ContainerStarted","Data":"6c4b6635dec19322cf1703dc788cb76095203938d32ca7a7a468b3e92e861cb6"} Apr 17 20:26:19.690531 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:19.690486 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" event={"ID":"2002ee70-8b98-4354-8670-18b07579e406","Type":"ContainerStarted","Data":"b802225f5950c832c51ebbb73fac8ed6002babe657dcb35217a56538163ce451"} Apr 17 20:26:20.695382 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:20.695348 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/2.log" Apr 17 20:26:20.695814 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:20.695788 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/1.log" Apr 17 20:26:20.696136 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:20.696105 2580 generic.go:358] "Generic (PLEG): container finished" podID="2b6aa065-f505-474b-a1ad-202164b57fc5" containerID="efde509ffb7f6c7cf288edeba20b95b508579125a48f21080d4c3606676d4a8c" exitCode=2 Apr 17 20:26:20.696203 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:20.696181 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" event={"ID":"2b6aa065-f505-474b-a1ad-202164b57fc5","Type":"ContainerDied","Data":"efde509ffb7f6c7cf288edeba20b95b508579125a48f21080d4c3606676d4a8c"} Apr 17 20:26:20.696243 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:20.696225 2580 scope.go:117] "RemoveContainer" containerID="cd8dd6c10b274d9904447ee8420757e391421044f252e260a60632db1bc9d7b7" Apr 17 20:26:20.697003 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:20.696861 2580 scope.go:117] "RemoveContainer" containerID="efde509ffb7f6c7cf288edeba20b95b508579125a48f21080d4c3606676d4a8c" Apr 17 20:26:20.697139 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:26:20.697068 2580 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:26:21.231163 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:21.231130 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-vxkn7"] Apr 17 20:26:21.700362 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:21.700329 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/2.log" Apr 17 20:26:25.715567 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:25.715528 2580 generic.go:358] "Generic (PLEG): container finished" podID="2002ee70-8b98-4354-8670-18b07579e406" containerID="6c4b6635dec19322cf1703dc788cb76095203938d32ca7a7a468b3e92e861cb6" exitCode=0 Apr 17 20:26:25.715966 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:25.715589 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" event={"ID":"2002ee70-8b98-4354-8670-18b07579e406","Type":"ContainerDied","Data":"6c4b6635dec19322cf1703dc788cb76095203938d32ca7a7a468b3e92e861cb6"} Apr 17 20:26:27.033534 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:27.033485 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:26:27.033534 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:27.033535 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:26:27.034079 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:27.033997 2580 scope.go:117] "RemoveContainer" containerID="efde509ffb7f6c7cf288edeba20b95b508579125a48f21080d4c3606676d4a8c" Apr 17 20:26:27.034405 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:26:27.034364 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:26:29.734416 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:29.734331 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" event={"ID":"2002ee70-8b98-4354-8670-18b07579e406","Type":"ContainerStarted","Data":"ab19023660d5b698f4f7192e295ef64d1299a36a8bcd78151d3d6268666885da"} Apr 17 20:26:29.734784 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:29.734615 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:29.750016 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:29.749976 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" podStartSLOduration=7.115691856 podStartE2EDuration="10.749963346s" podCreationTimestamp="2026-04-17 20:26:19 +0000 UTC" firstStartedPulling="2026-04-17 20:26:25.716202232 +0000 UTC m=+652.792574212" lastFinishedPulling="2026-04-17 20:26:29.350473721 +0000 UTC m=+656.426845702" observedRunningTime="2026-04-17 
20:26:29.749756281 +0000 UTC m=+656.826128283" watchObservedRunningTime="2026-04-17 20:26:29.749963346 +0000 UTC m=+656.826335340" Apr 17 20:26:39.526805 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:39.526772 2580 scope.go:117] "RemoveContainer" containerID="efde509ffb7f6c7cf288edeba20b95b508579125a48f21080d4c3606676d4a8c" Apr 17 20:26:39.527184 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:26:39.526972 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:26:40.753280 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:40.753248 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-ntccf" Apr 17 20:26:53.529024 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:53.528994 2580 scope.go:117] "RemoveContainer" containerID="efde509ffb7f6c7cf288edeba20b95b508579125a48f21080d4c3606676d4a8c" Apr 17 20:26:54.820211 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:54.820182 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/3.log" Apr 17 20:26:54.820658 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:54.820586 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/2.log" Apr 17 20:26:54.820915 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:54.820891 2580 generic.go:358] "Generic (PLEG): container finished" podID="2b6aa065-f505-474b-a1ad-202164b57fc5" containerID="372c9854efddaf96ddc31eca4810577b6caef6107085c42a4d4327b1ee6f6a33" exitCode=2 Apr 17 20:26:54.820986 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:54.820964 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" event={"ID":"2b6aa065-f505-474b-a1ad-202164b57fc5","Type":"ContainerDied","Data":"372c9854efddaf96ddc31eca4810577b6caef6107085c42a4d4327b1ee6f6a33"} Apr 17 20:26:54.821026 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:54.821010 2580 scope.go:117] "RemoveContainer" containerID="efde509ffb7f6c7cf288edeba20b95b508579125a48f21080d4c3606676d4a8c" Apr 17 20:26:54.821433 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:54.821417 2580 scope.go:117] "RemoveContainer" containerID="372c9854efddaf96ddc31eca4810577b6caef6107085c42a4d4327b1ee6f6a33" Apr 17 20:26:54.821672 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:26:54.821646 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:26:55.826367 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:55.826342 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/3.log" Apr 17 20:26:57.033601 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:57.033559 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:26:57.033601 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:57.033594 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:26:57.034162 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:26:57.034143 2580 scope.go:117] "RemoveContainer" containerID="372c9854efddaf96ddc31eca4810577b6caef6107085c42a4d4327b1ee6f6a33" Apr 17 20:26:57.034390 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:26:57.034368 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:27:10.527049 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:10.526956 2580 scope.go:117] "RemoveContainer" containerID="372c9854efddaf96ddc31eca4810577b6caef6107085c42a4d4327b1ee6f6a33" Apr 17 20:27:10.527514 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:27:10.527147 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:27:23.528793 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:23.528700 2580 scope.go:117] "RemoveContainer" containerID="372c9854efddaf96ddc31eca4810577b6caef6107085c42a4d4327b1ee6f6a33" Apr 17 20:27:23.529152 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:27:23.528942 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:27:34.527127 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:34.527096 2580 scope.go:117] "RemoveContainer" containerID="372c9854efddaf96ddc31eca4810577b6caef6107085c42a4d4327b1ee6f6a33" Apr 17 20:27:34.965716 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:34.965688 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/4.log" Apr 17 20:27:34.966084 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:34.966067 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/3.log" Apr 17 20:27:34.966420 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:34.966398 2580 generic.go:358] "Generic (PLEG): container finished" podID="2b6aa065-f505-474b-a1ad-202164b57fc5" containerID="a5cb22338cd06aae7484cd5d40aacd439bb0493f1c43113d4b970aa7b3623ae9" exitCode=2 Apr 17 20:27:34.966505 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:34.966442 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" 
event={"ID":"2b6aa065-f505-474b-a1ad-202164b57fc5","Type":"ContainerDied","Data":"a5cb22338cd06aae7484cd5d40aacd439bb0493f1c43113d4b970aa7b3623ae9"} Apr 17 20:27:34.966505 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:34.966500 2580 scope.go:117] "RemoveContainer" containerID="372c9854efddaf96ddc31eca4810577b6caef6107085c42a4d4327b1ee6f6a33" Apr 17 20:27:34.966925 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:34.966906 2580 scope.go:117] "RemoveContainer" containerID="a5cb22338cd06aae7484cd5d40aacd439bb0493f1c43113d4b970aa7b3623ae9" Apr 17 20:27:34.967155 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:27:34.967137 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:27:35.971366 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:35.971337 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/4.log" Apr 17 20:27:37.033710 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:37.033680 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:27:37.033710 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:37.033712 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:27:37.034162 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:37.034147 2580 scope.go:117] "RemoveContainer" containerID="a5cb22338cd06aae7484cd5d40aacd439bb0493f1c43113d4b970aa7b3623ae9" Apr 17 20:27:37.034339 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:27:37.034322 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:27:51.526845 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:27:51.526799 2580 scope.go:117] "RemoveContainer" containerID="a5cb22338cd06aae7484cd5d40aacd439bb0493f1c43113d4b970aa7b3623ae9" Apr 17 20:27:51.527238 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:27:51.527022 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:28:03.528784 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:28:03.528751 2580 scope.go:117] "RemoveContainer" containerID="a5cb22338cd06aae7484cd5d40aacd439bb0493f1c43113d4b970aa7b3623ae9" Apr 17 20:28:03.529190 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:28:03.528965 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" 
pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:28:17.526751 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:28:17.526715 2580 scope.go:117] "RemoveContainer" containerID="a5cb22338cd06aae7484cd5d40aacd439bb0493f1c43113d4b970aa7b3623ae9" Apr 17 20:28:17.527205 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:28:17.526980 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:28:31.526390 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:28:31.526352 2580 scope.go:117] "RemoveContainer" containerID="a5cb22338cd06aae7484cd5d40aacd439bb0493f1c43113d4b970aa7b3623ae9" Apr 17 20:28:31.526850 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:28:31.526564 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:28:45.526907 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:28:45.526718 2580 scope.go:117] "RemoveContainer" containerID="a5cb22338cd06aae7484cd5d40aacd439bb0493f1c43113d4b970aa7b3623ae9" Apr 17 20:28:45.527551 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:28:45.527412 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:28:56.527368 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:28:56.527273 2580 scope.go:117] "RemoveContainer" containerID="a5cb22338cd06aae7484cd5d40aacd439bb0493f1c43113d4b970aa7b3623ae9" Apr 17 20:28:57.247780 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:28:57.247752 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/5.log" Apr 17 20:28:57.248143 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:28:57.248127 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/4.log" Apr 17 20:28:57.248422 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:28:57.248391 2580 generic.go:358] "Generic (PLEG): container finished" podID="2b6aa065-f505-474b-a1ad-202164b57fc5" containerID="8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481" exitCode=2 Apr 17 20:28:57.248539 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:28:57.248490 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" event={"ID":"2b6aa065-f505-474b-a1ad-202164b57fc5","Type":"ContainerDied","Data":"8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481"} Apr 17 20:28:57.248539 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:28:57.248533 2580 scope.go:117] "RemoveContainer" 
containerID="a5cb22338cd06aae7484cd5d40aacd439bb0493f1c43113d4b970aa7b3623ae9" Apr 17 20:28:57.248939 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:28:57.248915 2580 scope.go:117] "RemoveContainer" containerID="8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481" Apr 17 20:28:57.249154 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:28:57.249136 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:28:58.253804 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:28:58.253781 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/5.log" Apr 17 20:29:07.034290 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:29:07.034256 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:29:07.034290 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:29:07.034297 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" Apr 17 20:29:07.034754 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:29:07.034737 2580 scope.go:117] "RemoveContainer" containerID="8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481" Apr 17 20:29:07.034934 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:29:07.034918 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:29:22.526720 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:29:22.526689 2580 scope.go:117] "RemoveContainer" containerID="8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481" Apr 17 20:29:22.527167 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:29:22.526892 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:29:35.532194 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:29:35.532168 2580 scope.go:117] "RemoveContainer" containerID="8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481" Apr 17 20:29:35.532621 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:29:35.532342 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:29:49.526905 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:29:49.526878 2580 scope.go:117] "RemoveContainer" 
containerID="8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481" Apr 17 20:29:49.527288 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:29:49.527068 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:30:00.142042 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:00.142006 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607630-9czdd"] Apr 17 20:30:00.145294 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:00.145277 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" Apr 17 20:30:00.147595 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:00.147578 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-z7qbr\"" Apr 17 20:30:00.151964 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:00.151940 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607630-9czdd"] Apr 17 20:30:00.178512 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:00.178478 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btqlk\" (UniqueName: \"kubernetes.io/projected/5abd1edf-8e99-4135-a512-9b0bd22c2d69-kube-api-access-btqlk\") pod \"maas-api-key-cleanup-29607630-9czdd\" (UID: \"5abd1edf-8e99-4135-a512-9b0bd22c2d69\") " pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" Apr 17 20:30:00.279581 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:00.279544 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btqlk\" (UniqueName: \"kubernetes.io/projected/5abd1edf-8e99-4135-a512-9b0bd22c2d69-kube-api-access-btqlk\") pod \"maas-api-key-cleanup-29607630-9czdd\" (UID: \"5abd1edf-8e99-4135-a512-9b0bd22c2d69\") " pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" Apr 17 20:30:00.287652 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:00.287612 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btqlk\" (UniqueName: \"kubernetes.io/projected/5abd1edf-8e99-4135-a512-9b0bd22c2d69-kube-api-access-btqlk\") pod \"maas-api-key-cleanup-29607630-9czdd\" (UID: \"5abd1edf-8e99-4135-a512-9b0bd22c2d69\") " pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" Apr 17 20:30:00.455845 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:00.455756 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" Apr 17 20:30:00.526330 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:00.526305 2580 scope.go:117] "RemoveContainer" containerID="8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481" Apr 17 20:30:00.526543 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:30:00.526525 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:30:00.577157 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:00.577121 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607630-9czdd"] Apr 17 20:30:00.580166 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:30:00.580142 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5abd1edf_8e99_4135_a512_9b0bd22c2d69.slice/crio-71dd082ae233c77bfea3092d06083b7a8a808239cdefa515b5757ad11813314c WatchSource:0}: Error finding container 71dd082ae233c77bfea3092d06083b7a8a808239cdefa515b5757ad11813314c: Status 404 returned error can't find the container with id 71dd082ae233c77bfea3092d06083b7a8a808239cdefa515b5757ad11813314c Apr 17 20:30:01.471669 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:01.471623 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" event={"ID":"5abd1edf-8e99-4135-a512-9b0bd22c2d69","Type":"ContainerStarted","Data":"71dd082ae233c77bfea3092d06083b7a8a808239cdefa515b5757ad11813314c"} Apr 17 20:30:03.479235 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:03.479199 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" event={"ID":"5abd1edf-8e99-4135-a512-9b0bd22c2d69","Type":"ContainerStarted","Data":"dd1babf922dd85c1c1e146bd5c45ab6f06b29d85562a2cb011cf7ba7f5e57e2f"} Apr 17 20:30:03.493236 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:03.493194 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" podStartSLOduration=1.368497753 podStartE2EDuration="3.493178731s" podCreationTimestamp="2026-04-17 20:30:00 +0000 UTC" firstStartedPulling="2026-04-17 20:30:00.581901664 +0000 UTC m=+867.658273643" lastFinishedPulling="2026-04-17 20:30:02.706582638 +0000 UTC m=+869.782954621" observedRunningTime="2026-04-17 20:30:03.491867841 +0000 UTC m=+870.568239842" watchObservedRunningTime="2026-04-17 20:30:03.493178731 +0000 UTC m=+870.569550735" Apr 17 20:30:13.528892 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:13.528861 2580 scope.go:117] "RemoveContainer" containerID="8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481" Apr 17 20:30:13.529344 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:30:13.529090 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:30:23.546232 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:23.546198 
2580 generic.go:358] "Generic (PLEG): container finished" podID="5abd1edf-8e99-4135-a512-9b0bd22c2d69" containerID="dd1babf922dd85c1c1e146bd5c45ab6f06b29d85562a2cb011cf7ba7f5e57e2f" exitCode=6 Apr 17 20:30:23.546639 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:23.546252 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" event={"ID":"5abd1edf-8e99-4135-a512-9b0bd22c2d69","Type":"ContainerDied","Data":"dd1babf922dd85c1c1e146bd5c45ab6f06b29d85562a2cb011cf7ba7f5e57e2f"} Apr 17 20:30:23.546639 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:23.546531 2580 scope.go:117] "RemoveContainer" containerID="dd1babf922dd85c1c1e146bd5c45ab6f06b29d85562a2cb011cf7ba7f5e57e2f" Apr 17 20:30:24.551263 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:24.551230 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" event={"ID":"5abd1edf-8e99-4135-a512-9b0bd22c2d69","Type":"ContainerStarted","Data":"6b6fd3a7811830c89150baf2482576905cae10a1cb3fc850d262abb64adffd04"} Apr 17 20:30:27.527363 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:27.527327 2580 scope.go:117] "RemoveContainer" containerID="8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481" Apr 17 20:30:27.527861 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:30:27.527605 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:30:33.441533 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:33.441505 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/5.log" Apr 17 20:30:33.443650 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:33.443628 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/5.log" Apr 17 20:30:33.456927 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:33.456901 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-acl-logging/0.log" Apr 17 20:30:33.458783 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:33.458766 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-acl-logging/0.log" Apr 17 20:30:36.059915 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:36.059889 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29607630-9czdd_5abd1edf-8e99-4135-a512-9b0bd22c2d69/cleanup/1.log" Apr 17 20:30:36.619176 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:36.619147 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-799c8bc7d9-97hnf_95bc4d30-7395-4a2a-8de4-7d525388ec83/manager/0.log" Apr 17 20:30:38.234421 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:38.234392 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-7kz58_8af0c5f7-0993-405e-a894-94b97c9218f2/kuadrant-console-plugin/0.log" Apr 17 20:30:38.554447 ip-10-0-132-57 
kubenswrapper[2580]: I0417 20:30:38.554352 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-vxkn7_e63872d6-333e-4199-9281-a33b9fc40c93/limitador/0.log" Apr 17 20:30:39.289893 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:39.289864 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-686dfbc75c-zz9tq_3db732ac-f37e-4b9d-a213-cc5a30a150bf/kube-auth-proxy/0.log" Apr 17 20:30:39.500334 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:39.500310 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-65d65c8fdb-58dhc_3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9/router/0.log" Apr 17 20:30:39.824916 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:39.824892 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/storage-initializer/0.log" Apr 17 20:30:39.830932 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:39.830909 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_2b6aa065-f505-474b-a1ad-202164b57fc5/main/5.log" Apr 17 20:30:40.045998 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:40.045963 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-ntccf_2002ee70-8b98-4354-8670-18b07579e406/storage-initializer/0.log" Apr 17 20:30:40.052192 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:40.052175 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-ntccf_2002ee70-8b98-4354-8670-18b07579e406/main/0.log" Apr 17 20:30:41.527227 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:41.527198 2580 scope.go:117] "RemoveContainer" containerID="8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481" Apr 17 20:30:41.527706 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:30:41.527385 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:30:44.621750 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:44.621721 2580 generic.go:358] "Generic (PLEG): container finished" podID="5abd1edf-8e99-4135-a512-9b0bd22c2d69" containerID="6b6fd3a7811830c89150baf2482576905cae10a1cb3fc850d262abb64adffd04" exitCode=6 Apr 17 20:30:44.622151 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:44.621793 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" event={"ID":"5abd1edf-8e99-4135-a512-9b0bd22c2d69","Type":"ContainerDied","Data":"6b6fd3a7811830c89150baf2482576905cae10a1cb3fc850d262abb64adffd04"} Apr 17 20:30:44.622151 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:44.621832 2580 scope.go:117] "RemoveContainer" containerID="dd1babf922dd85c1c1e146bd5c45ab6f06b29d85562a2cb011cf7ba7f5e57e2f" Apr 17 20:30:44.622151 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:44.622091 2580 scope.go:117] "RemoveContainer" containerID="6b6fd3a7811830c89150baf2482576905cae10a1cb3fc850d262abb64adffd04" Apr 17 20:30:44.622296 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:30:44.622276 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607630-9czdd_opendatahub(5abd1edf-8e99-4135-a512-9b0bd22c2d69)\"" pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" podUID="5abd1edf-8e99-4135-a512-9b0bd22c2d69" Apr 17 20:30:47.354094 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:47.354066 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pmz57_b5439bd7-7d09-4129-aaf9-d27b03010197/global-pull-secret-syncer/0.log" Apr 17 20:30:47.481448 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:47.481417 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xtvgg_87977dd5-cce0-46ae-8d11-ddfd87452aef/konnectivity-agent/0.log" Apr 17 20:30:47.524140 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:47.524099 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-57.ec2.internal_ad918dee123885abb804caebda37d740/haproxy/0.log" Apr 17 20:30:51.997278 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:51.997201 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-7kz58_8af0c5f7-0993-405e-a894-94b97c9218f2/kuadrant-console-plugin/0.log" Apr 17 20:30:52.083958 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:52.083933 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-vxkn7_e63872d6-333e-4199-9281-a33b9fc40c93/limitador/0.log" Apr 17 20:30:53.570477 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:53.570428 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_79f5d464-0615-476c-8303-51771c3852b6/alertmanager/0.log" Apr 17 20:30:53.593028 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:53.593005 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_79f5d464-0615-476c-8303-51771c3852b6/config-reloader/0.log" Apr 17 20:30:53.614547 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:53.614528 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_79f5d464-0615-476c-8303-51771c3852b6/kube-rbac-proxy-web/0.log" Apr 17 20:30:53.633860 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:53.633837 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_79f5d464-0615-476c-8303-51771c3852b6/kube-rbac-proxy/0.log" Apr 17 20:30:53.654012 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:53.653985 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_79f5d464-0615-476c-8303-51771c3852b6/kube-rbac-proxy-metric/0.log" Apr 17 20:30:53.691006 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:53.690969 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_79f5d464-0615-476c-8303-51771c3852b6/prom-label-proxy/0.log" Apr 17 20:30:53.730106 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:53.730071 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_79f5d464-0615-476c-8303-51771c3852b6/init-config-reloader/0.log" Apr 17 20:30:53.895263 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:53.895197 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_metrics-server-7f9447cc77-gq78m_8fb9d968-11f8-424c-833a-9133403fbf4e/metrics-server/0.log" Apr 17 20:30:53.916602 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:53.916575 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-klvtv_336ee31c-aa4b-4408-9cbd-46e77e017cfa/monitoring-plugin/0.log" Apr 17 20:30:53.943903 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:53.943880 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-79v5j_695db0e5-f748-4b2b-ad9f-a5a810dcad9b/node-exporter/0.log" Apr 17 20:30:53.961965 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:53.961949 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-79v5j_695db0e5-f748-4b2b-ad9f-a5a810dcad9b/kube-rbac-proxy/0.log" Apr 17 20:30:53.981963 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:53.981943 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-79v5j_695db0e5-f748-4b2b-ad9f-a5a810dcad9b/init-textfile/0.log" Apr 17 20:30:54.175862 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.175784 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-qqk8v_dd485dc8-bb69-4490-83ff-fb09472c93f4/kube-rbac-proxy-main/0.log" Apr 17 20:30:54.195419 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.195390 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-qqk8v_dd485dc8-bb69-4490-83ff-fb09472c93f4/kube-rbac-proxy-self/0.log" Apr 17 20:30:54.216705 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.216676 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-qqk8v_dd485dc8-bb69-4490-83ff-fb09472c93f4/openshift-state-metrics/0.log" Apr 17 20:30:54.268149 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.268109 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_18946c1e-e029-484b-89f2-05cde4450668/prometheus/0.log" Apr 17 20:30:54.285663 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.285618 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_18946c1e-e029-484b-89f2-05cde4450668/config-reloader/0.log" Apr 17 20:30:54.304267 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.304249 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_18946c1e-e029-484b-89f2-05cde4450668/thanos-sidecar/0.log" Apr 17 20:30:54.325129 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.325112 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_18946c1e-e029-484b-89f2-05cde4450668/kube-rbac-proxy-web/0.log" Apr 17 20:30:54.345629 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.345607 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_18946c1e-e029-484b-89f2-05cde4450668/kube-rbac-proxy/0.log" Apr 17 20:30:54.364067 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.364049 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_18946c1e-e029-484b-89f2-05cde4450668/kube-rbac-proxy-thanos/0.log" Apr 17 20:30:54.387483 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.387439 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_18946c1e-e029-484b-89f2-05cde4450668/init-config-reloader/0.log" Apr 17 20:30:54.486399 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.486323 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7dffdb7bd4-5fdhx_754c0844-0836-4dca-9230-21c7a04f6de9/telemeter-client/0.log" Apr 17 20:30:54.505598 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.505578 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7dffdb7bd4-5fdhx_754c0844-0836-4dca-9230-21c7a04f6de9/reload/0.log" Apr 17 20:30:54.524552 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:54.524535 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7dffdb7bd4-5fdhx_754c0844-0836-4dca-9230-21c7a04f6de9/kube-rbac-proxy/0.log" Apr 17 20:30:55.526708 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:55.526678 2580 scope.go:117] "RemoveContainer" containerID="8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481" Apr 17 20:30:55.527080 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:55.526786 2580 scope.go:117] "RemoveContainer" containerID="6b6fd3a7811830c89150baf2482576905cae10a1cb3fc850d262abb64adffd04" Apr 17 20:30:55.527080 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:30:55.526881 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:30:55.722261 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:55.722229 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-5grgb_5e2d98af-d82a-4f4f-a48d-77a96c374f2c/networking-console-plugin/0.log" Apr 17 20:30:56.048516 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.048482 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8"] Apr 17 20:30:56.051748 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.051728 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.053812 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.053788 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-klrcl\"/\"kube-root-ca.crt\"" Apr 17 20:30:56.053920 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.053813 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-klrcl\"/\"default-dockercfg-b2tbp\"" Apr 17 20:30:56.053920 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.053800 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-klrcl\"/\"openshift-service-ca.crt\"" Apr 17 20:30:56.057993 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.057970 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8"] Apr 17 20:30:56.184338 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.184250 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1996311-a2b6-436d-8376-2ce22d0078df-lib-modules\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.184338 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.184287 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1996311-a2b6-436d-8376-2ce22d0078df-sys\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.184338 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.184309 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz8x2\" (UniqueName: \"kubernetes.io/projected/c1996311-a2b6-436d-8376-2ce22d0078df-kube-api-access-gz8x2\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.184338 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.184331 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c1996311-a2b6-436d-8376-2ce22d0078df-proc\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.184617 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.184393 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c1996311-a2b6-436d-8376-2ce22d0078df-podres\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.285710 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.285681 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c1996311-a2b6-436d-8376-2ce22d0078df-podres\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " 
pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.285855 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.285720 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1996311-a2b6-436d-8376-2ce22d0078df-lib-modules\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.285855 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.285745 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1996311-a2b6-436d-8376-2ce22d0078df-sys\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.285855 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.285766 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz8x2\" (UniqueName: \"kubernetes.io/projected/c1996311-a2b6-436d-8376-2ce22d0078df-kube-api-access-gz8x2\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.285855 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.285790 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c1996311-a2b6-436d-8376-2ce22d0078df-proc\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.285855 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.285846 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1996311-a2b6-436d-8376-2ce22d0078df-lib-modules\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.286050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.285845 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c1996311-a2b6-436d-8376-2ce22d0078df-podres\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.286050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.285882 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c1996311-a2b6-436d-8376-2ce22d0078df-proc\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.286050 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.285890 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1996311-a2b6-436d-8376-2ce22d0078df-sys\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.292795 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.292761 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gz8x2\" (UniqueName: \"kubernetes.io/projected/c1996311-a2b6-436d-8376-2ce22d0078df-kube-api-access-gz8x2\") pod \"perf-node-gather-daemonset-v8bp8\" (UID: \"c1996311-a2b6-436d-8376-2ce22d0078df\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.362564 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.362537 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.483932 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.483904 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8"] Apr 17 20:30:56.486525 ip-10-0-132-57 kubenswrapper[2580]: W0417 20:30:56.486492 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc1996311_a2b6_436d_8376_2ce22d0078df.slice/crio-3eb7ff4c0c70826c42a0512f22734de1e41c6b53111edbc257b60f1cd599ea55 WatchSource:0}: Error finding container 3eb7ff4c0c70826c42a0512f22734de1e41c6b53111edbc257b60f1cd599ea55: Status 404 returned error can't find the container with id 3eb7ff4c0c70826c42a0512f22734de1e41c6b53111edbc257b60f1cd599ea55 Apr 17 20:30:56.665384 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.665343 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" event={"ID":"5abd1edf-8e99-4135-a512-9b0bd22c2d69","Type":"ContainerStarted","Data":"2c20658280b7becf9ffcd9189cbeeb6b34730aa792dbfb45a08a382dcb9c9d8b"} Apr 17 20:30:56.666693 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.666668 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" event={"ID":"c1996311-a2b6-436d-8376-2ce22d0078df","Type":"ContainerStarted","Data":"ff580bf2375c9b19b3718d56512b75a1bb69a0ac73d43f48497354ccf46b3b9e"} Apr 17 20:30:56.666693 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.666695 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" event={"ID":"c1996311-a2b6-436d-8376-2ce22d0078df","Type":"ContainerStarted","Data":"3eb7ff4c0c70826c42a0512f22734de1e41c6b53111edbc257b60f1cd599ea55"} Apr 17 20:30:56.666889 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.666791 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:30:56.691995 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.691892 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" podStartSLOduration=0.691879837 podStartE2EDuration="691.879837ms" podCreationTimestamp="2026-04-17 20:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:30:56.691376396 +0000 UTC m=+923.767748403" watchObservedRunningTime="2026-04-17 20:30:56.691879837 +0000 UTC m=+923.768251839" Apr 17 20:30:56.778722 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:56.778695 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-49zfd_7dcf50fe-0775-4474-b4fd-e451ff50c3a5/download-server/0.log" Apr 17 20:30:57.688276 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:57.688242 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["opendatahub/maas-api-key-cleanup-29607630-9czdd"] Apr 17 20:30:57.688688 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:57.688471 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29607630-9czdd" podUID="5abd1edf-8e99-4135-a512-9b0bd22c2d69" containerName="cleanup" containerID="cri-o://2c20658280b7becf9ffcd9189cbeeb6b34730aa792dbfb45a08a382dcb9c9d8b" gracePeriod=30 Apr 17 20:30:58.067640 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:58.067590 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nztqs_29f57080-c48b-42b7-8c1a-747b7fd06533/dns/0.log" Apr 17 20:30:58.086652 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:58.086629 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nztqs_29f57080-c48b-42b7-8c1a-747b7fd06533/kube-rbac-proxy/0.log" Apr 17 20:30:58.198085 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:58.198056 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h6fk4_28b18d89-4df2-405e-8e06-5f5e39694305/dns-node-resolver/0.log" Apr 17 20:30:58.754509 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:58.754446 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n6lwg_9d8cbec6-ac91-4373-b7ef-593404bf8a86/node-ca/0.log" Apr 17 20:30:59.600835 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:59.600811 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-686dfbc75c-zz9tq_3db732ac-f37e-4b9d-a213-cc5a30a150bf/kube-auth-proxy/0.log" Apr 17 20:30:59.650167 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:30:59.650140 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-65d65c8fdb-58dhc_3009a1d4-0ea7-4cb8-bffe-4a367b96d4b9/router/0.log" Apr 17 20:31:00.136118 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:00.136089 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4x62t_e270f686-4250-41f8-a9c1-6f192df2ee57/serve-healthcheck-canary/0.log" Apr 17 20:31:00.624663 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:00.624637 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2vjgj_a1e13515-b5f1-46d8-bc11-773c27792e7e/kube-rbac-proxy/0.log" Apr 17 20:31:00.646846 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:00.646822 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2vjgj_a1e13515-b5f1-46d8-bc11-773c27792e7e/exporter/0.log" Apr 17 20:31:00.667205 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:00.667186 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2vjgj_a1e13515-b5f1-46d8-bc11-773c27792e7e/extractor/0.log" Apr 17 20:31:02.636735 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:02.636705 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29607630-9czdd_5abd1edf-8e99-4135-a512-9b0bd22c2d69/cleanup/2.log" Apr 17 20:31:02.637107 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:02.636705 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29607630-9czdd_5abd1edf-8e99-4135-a512-9b0bd22c2d69/cleanup/1.log" Apr 17 20:31:02.681067 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:02.681041 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-v8bp8" Apr 17 20:31:02.786138 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:02.786112 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-799c8bc7d9-97hnf_95bc4d30-7395-4a2a-8de4-7d525388ec83/manager/0.log" Apr 17 20:31:03.927434 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:03.927408 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-fd99964b4-jkhdk_97ad1cbe-cd5a-4054-ba83-b963654e0afb/manager/0.log" Apr 17 20:31:08.319846 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:08.319819 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bmcs7_719e086f-1a8a-434b-90a0-cd72fcae76c0/migrator/0.log" Apr 17 20:31:08.338342 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:08.338316 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bmcs7_719e086f-1a8a-434b-90a0-cd72fcae76c0/graceful-termination/0.log" Apr 17 20:31:09.990339 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:09.990313 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvs8p_16a5e25e-23a4-4106-a67e-adda44b1aaa6/kube-multus-additional-cni-plugins/0.log" Apr 17 20:31:10.010314 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:10.010283 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvs8p_16a5e25e-23a4-4106-a67e-adda44b1aaa6/egress-router-binary-copy/0.log" Apr 17 20:31:10.029298 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:10.029270 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvs8p_16a5e25e-23a4-4106-a67e-adda44b1aaa6/cni-plugins/0.log" Apr 17 20:31:10.050203 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:10.050166 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvs8p_16a5e25e-23a4-4106-a67e-adda44b1aaa6/bond-cni-plugin/0.log" Apr 17 20:31:10.072313 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:10.072289 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvs8p_16a5e25e-23a4-4106-a67e-adda44b1aaa6/routeoverride-cni/0.log" Apr 17 20:31:10.091901 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:10.091821 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvs8p_16a5e25e-23a4-4106-a67e-adda44b1aaa6/whereabouts-cni-bincopy/0.log" Apr 17 20:31:10.112453 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:10.109820 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvs8p_16a5e25e-23a4-4106-a67e-adda44b1aaa6/whereabouts-cni/0.log" Apr 17 20:31:10.137827 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:10.137803 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lj6hr_018ba037-0cf6-4ce0-ba07-95893c240cd2/kube-multus/0.log" Apr 17 20:31:10.234742 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:10.234716 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-99wq2_d943896a-8c08-4d43-b1c4-d738b0079503/network-metrics-daemon/0.log" Apr 17 20:31:10.253090 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:10.253011 2580 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-99wq2_d943896a-8c08-4d43-b1c4-d738b0079503/kube-rbac-proxy/0.log" Apr 17 20:31:10.527335 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:10.527254 2580 scope.go:117] "RemoveContainer" containerID="8510d7ce61627b2c9e4ea0ca5122974b5cd6062105dfa3fa05b8c4b383cfa481" Apr 17 20:31:10.527511 ip-10-0-132-57 kubenswrapper[2580]: E0417 20:31:10.527492 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v_llm(2b6aa065-f505-474b-a1ad-202164b57fc5)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-pc75v" podUID="2b6aa065-f505-474b-a1ad-202164b57fc5" Apr 17 20:31:11.619551 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:11.619521 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-controller/0.log" Apr 17 20:31:11.639612 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:11.639579 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-acl-logging/0.log" Apr 17 20:31:11.644334 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:11.644310 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovn-acl-logging/1.log" Apr 17 20:31:11.660601 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:11.660563 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/kube-rbac-proxy-node/0.log" Apr 17 20:31:11.679753 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:11.679726 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 20:31:11.696707 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:11.696680 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/northd/0.log" Apr 17 20:31:11.716334 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:11.716302 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/nbdb/0.log" Apr 17 20:31:11.735881 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:11.735846 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/sbdb/0.log" Apr 17 20:31:11.823262 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:11.823221 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vrmd6_c81797ca-3338-4294-8ef8-fb0416677637/ovnkube-controller/0.log" Apr 17 20:31:12.932519 ip-10-0-132-57 kubenswrapper[2580]: I0417 20:31:12.932492 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-d9lfz_05059fd8-9f1b-4374-81cf-fd56830ab0bb/network-check-target-container/0.log"