Apr 17 18:46:34.654346 ip-10-0-132-192 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 18:46:34.654359 ip-10-0-132-192 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 18:46:34.654369 ip-10-0-132-192 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 18:46:34.654653 ip-10-0-132-192 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 18:46:44.660036 ip-10-0-132-192 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 18:46:44.660052 ip-10-0-132-192 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f5784c6c77164068b4b38cb32c0363c5 --
Apr 17 18:49:20.625796 ip-10-0-132-192 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 18:49:21.005531 ip-10-0-132-192 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:49:21.005531 ip-10-0-132-192 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 18:49:21.005531 ip-10-0-132-192 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:49:21.005531 ip-10-0-132-192 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 18:49:21.005531 ip-10-0-132-192 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:49:21.006965 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.006847 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 18:49:21.010097 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010077 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:49:21.010097 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010095 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:49:21.010097 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010098 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:49:21.010097 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010102 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010105 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010109 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010112 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010115 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010117 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010120 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010123 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010125 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010128 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010132 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010135 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010138 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010141 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010144 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010146 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010149 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010152 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010154 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010157 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:49:21.010258 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010159 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010162 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010171 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010174 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010177 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010179 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010182 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010184 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010187 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010189 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010192 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010195 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010197 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010200 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010204 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010207 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010209 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010212 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010215 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010217 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:49:21.010792 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010227 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010229 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010232 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010235 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010237 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010240 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010242 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010245 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010248 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010250 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010253 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010255 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010258 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010261 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010266 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010270 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010273 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010277 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010280 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010283 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:49:21.011277 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010286 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010289 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010291 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010294 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010297 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010300 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010302 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010305 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010307 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010310 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010313 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010315 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010318 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010321 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010324 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010327 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010329 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010332 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010334 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:49:21.011783 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010338 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010341 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010343 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010345 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010730 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010736 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010739 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010741 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010744 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010747 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010750 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010753 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010756 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010758 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010761 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010764 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010767 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010770 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010773 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010775 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:49:21.012241 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010778 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010780 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010783 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010785 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010788 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010791 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010794 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010797 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010801 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010804 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010807 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010810 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010813 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010816 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010818 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010821 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010823 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010826 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010830 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:49:21.012730 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010834 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010836 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010839 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010842 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010845 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010848 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010851 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010853 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010857 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010860 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010863 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010866 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010868 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010872 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010875 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010877 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010880 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010882 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010885 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010887 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:49:21.013242 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010890 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010893 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010895 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010898 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010900 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010903 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010905 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010908 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010910 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010913 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010915 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010917 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010920 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010922 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010925 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010928 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010931 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010933 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010936 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010938 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:49:21.013740 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010941 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010944 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010947 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010949 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010952 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010954 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010957 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010960 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010962 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010964 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.010967 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012097 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012106 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012114 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012119 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012123 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012127 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012131 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012140 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012143 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012147 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 18:49:21.014223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012150 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012154 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012157 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012160 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012163 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012166 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012169 2571 flags.go:64] FLAG: --cloud-config=""
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012172 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012175 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012179 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012182 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012185 2571 flags.go:64] FLAG: --config-dir=""
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012188 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012192 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012196 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012199 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012202 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012205 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012208 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012211 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012214 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012217 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012220 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012225 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012228 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 18:49:21.014746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012231 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012234 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012237 2571 flags.go:64] FLAG: --enable-server="true"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012240 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012245 2571 flags.go:64] FLAG: --event-burst="100"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012248 2571 flags.go:64] FLAG: --event-qps="50"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012252 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012255 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012258 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012261 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012264 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012268 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012271 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012274 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012277 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012280 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012283 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012286 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012289 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012292 2571 flags.go:64] FLAG: --feature-gates=""
Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012296 2571 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012299 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012302 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012306 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012309 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 17 18:49:21.015354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012312 2571 flags.go:64] FLAG: --help="false" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012315 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012318 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012321 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012324 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012328 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012331 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012334 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012338 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 18:49:21.015968 ip-10-0-132-192 
kubenswrapper[2571]: I0417 18:49:21.012341 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012344 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012346 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012350 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012352 2571 flags.go:64] FLAG: --kube-reserved="" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012356 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012358 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012362 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012365 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012368 2571 flags.go:64] FLAG: --lock-file="" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012371 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012374 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012377 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012382 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 18:49:21.015968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012385 2571 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012388 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012391 2571 flags.go:64] FLAG: --logging-format="text" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012394 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012397 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012400 2571 flags.go:64] FLAG: --manifest-url="" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012403 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012407 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012410 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012414 2571 flags.go:64] FLAG: --max-pods="110" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012421 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012425 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012428 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012431 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012434 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 18:49:21.016547 
ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012437 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012439 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012447 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012451 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012454 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012468 2571 flags.go:64] FLAG: --pod-cidr="" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012472 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012477 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012480 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 18:49:21.016547 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012483 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012486 2571 flags.go:64] FLAG: --port="10250" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012489 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012492 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0554bc2bc727d1a2d" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012495 2571 flags.go:64] FLAG: --qos-reserved="" Apr 
17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012498 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012501 2571 flags.go:64] FLAG: --register-node="true" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012504 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012507 2571 flags.go:64] FLAG: --register-with-taints="" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012510 2571 flags.go:64] FLAG: --registry-burst="10" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012513 2571 flags.go:64] FLAG: --registry-qps="5" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012516 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012520 2571 flags.go:64] FLAG: --reserved-memory="" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012524 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012527 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012530 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012533 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012536 2571 flags.go:64] FLAG: --runonce="false" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012540 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012543 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012546 2571 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012549 2571 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012553 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012556 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012559 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012562 2571 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 18:49:21.017110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012565 2571 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012568 2571 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012571 2571 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012574 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012577 2571 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012580 2571 flags.go:64] FLAG: --system-cgroups=""
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012583 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012588 2571 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012591 2571 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012594 2571 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012598 2571 flags.go:64] FLAG: --tls-min-version=""
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012601 2571 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012604 2571 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012606 2571 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012609 2571 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012612 2571 flags.go:64] FLAG: --v="2"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012617 2571 flags.go:64] FLAG: --version="false"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012621 2571 flags.go:64] FLAG: --vmodule=""
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012626 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.012629 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012715 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012718 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012722 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012724 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:49:21.017755 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012728 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012731 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012734 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012737 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012740 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012743 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012745 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012748 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012750 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012753 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012756 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012759 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012761 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012764 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012766 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012769 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012771 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012774 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012776 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012779 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:49:21.018360 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012781 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012784 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012786 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012789 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012791 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012794 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012797 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012800 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012803 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012805 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012808 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012810 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012814 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012817 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012819 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012822 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012825 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012827 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012830 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012833 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 18:49:21.019094 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012835 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012838 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012841 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012844 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012846 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012849 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012852 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012855 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012857 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012860 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012862 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012865 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012867 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012870 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012872 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012875 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012877 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012880 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012884 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012886 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:49:21.020125 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012889 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012892 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012895 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012897 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012901 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012904 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012906 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012909 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012911 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012914 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012916 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012919 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012923 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012927 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012930 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012932 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012940 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012944 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012947 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:49:21.020977 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012950 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012952 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.012955 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.013574 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.020620 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.020773 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020850 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020861 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020867 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020872 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020876 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020881 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020885 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020889 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020894 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:49:21.021841 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020898 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020902 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020907 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020911 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020915 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020919 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020924 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020929 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020933 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020937 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020942 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020947 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020952 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020956 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020961 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020965 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020969 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020973 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020976 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020981 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:49:21.022283 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020985 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020988 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.020999 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021004 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021009 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021013 2571 feature_gate.go:328] unrecognized feature gate:
VSphereHostVMGroupZonal Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021018 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021022 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021026 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021031 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021035 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021039 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021043 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021047 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021051 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021055 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021059 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021064 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021068 
2571 feature_gate.go:328] unrecognized feature gate: Example Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021072 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 18:49:21.022866 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021076 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021080 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021084 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021089 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021093 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021097 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021102 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021106 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021110 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021114 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021131 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 18:49:21.023699 
ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021135 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021140 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021144 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021149 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021156 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021160 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021165 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021169 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021173 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 18:49:21.023699 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021178 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021182 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021186 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021190 2571 feature_gate.go:328] unrecognized feature 
gate: CPMSMachineNamePrefix Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021194 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021198 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021203 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021209 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021215 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021219 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021224 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021228 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021232 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021236 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021241 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 18:49:21.024259 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021245 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 18:49:21.024259 ip-10-0-132-192 
kubenswrapper[2571]: W0417 18:49:21.021249 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.021257 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021434 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021442 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021447 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021452 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021473 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021479 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021483 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021487 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 18:49:21.024700 
ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021492 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021497 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021503 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021507 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021512 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021516 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 18:49:21.024700 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021520 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021524 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021529 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021533 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021537 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021541 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021546 2571 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021550 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021554 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021559 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021563 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021567 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021571 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021576 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021580 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021584 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021588 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021592 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021597 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021601 2571 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 18:49:21.025086 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021605 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021610 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021614 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021619 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021623 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021626 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021631 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021636 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021640 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021645 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021649 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021653 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 18:49:21.025791 ip-10-0-132-192 
kubenswrapper[2571]: W0417 18:49:21.021657 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021661 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021668 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021674 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021679 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021684 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021688 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021693 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 18:49:21.025791 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021697 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 18:49:21.026569 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.021703 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 18:49:21.027175 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.022508 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 18:49:21.027175 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.026868 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 18:49:21.027752 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.027738 2571 server.go:1019] "Starting client certificate rotation"
Apr 17 18:49:21.027878 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.027863 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 18:49:21.027911 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.027900 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 18:49:21.051584 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.051561 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 18:49:21.054509 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.054487 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 18:49:21.066585 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.066563 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 17 18:49:21.071839 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.071824 2571 log.go:25] "Validated CRI v1 image API"
Apr 17 18:49:21.073526 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.073513 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 18:49:21.076228 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.076210 2571 fs.go:135] Filesystem UUIDs: map[664a6ef3-481a-4563-be4c-c79612f922e0:/dev/nvme0n1p3 75bcb0cb-bf02-49e1-b97c-cd3141deed73:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 17 18:49:21.076282 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.076228 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 18:49:21.080408 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.080392 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 18:49:21.081295 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.081187 2571 manager.go:217] Machine: {Timestamp:2026-04-17 18:49:21.080060709 +0000 UTC m=+0.354078014 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099429 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2861dfcbb967877fbf3d87b0750091 SystemUUID:ec2861df-cbb9-6787-7fbf-3d87b0750091 BootID:f5784c6c-7716-4068-b4b3-8cb32c0363c5 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs Inodes:4048401 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a1:1c:b1:54:29 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a1:1c:b1:54:29 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c6:43:6a:4a:3c:47 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 18:49:21.081295 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.081291 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 18:49:21.081445 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.081398 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 18:49:21.083717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.083692 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 18:49:21.083891 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.083720 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-192.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 18:49:21.083940 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.083909 2571 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 18:49:21.083940 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.083922 2571 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 18:49:21.083993 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.083941 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 18:49:21.083993 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.083965 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 18:49:21.084825 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.084814 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 17 18:49:21.084932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.084923 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 18:49:21.086939 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.086929 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 17 18:49:21.086977 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.086948 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 18:49:21.087551 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.087542 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 18:49:21.087584 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.087554 2571 kubelet.go:397] "Adding apiserver pod source" Apr 17 18:49:21.087584 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.087563 2571 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 18:49:21.088786 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.088773 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 18:49:21.088840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.088791 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 18:49:21.091601 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.091573 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 18:49:21.093860 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.093847 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 18:49:21.094905 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.094892 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 18:49:21.094946 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.094912 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 18:49:21.094946 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.094921 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 18:49:21.094946 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.094927 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 18:49:21.094946 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.094932 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 18:49:21.094946 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.094938 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 18:49:21.094946 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.094944 2571 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 17 18:49:21.095104 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.094950 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 18:49:21.095104 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.094958 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 18:49:21.095104 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.094964 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 18:49:21.095104 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.094972 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 18:49:21.095104 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.094980 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 18:49:21.095737 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.095724 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 18:49:21.095780 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.095739 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 18:49:21.097329 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.097305 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-192.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 18:49:21.097419 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.097308 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 18:49:21.098835 ip-10-0-132-192 kubenswrapper[2571]: 
I0417 18:49:21.098822 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-192.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 18:49:21.099342 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.099331 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 18:49:21.099386 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.099364 2571 server.go:1295] "Started kubelet" Apr 17 18:49:21.100134 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.099537 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 18:49:21.100481 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.100247 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 18:49:21.100569 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.099469 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 18:49:21.101043 ip-10-0-132-192 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 18:49:21.101985 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.101795 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lctg2" Apr 17 18:49:21.102599 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.102577 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 18:49:21.103816 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.103798 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 17 18:49:21.106168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.106149 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lctg2" Apr 17 18:49:21.107302 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.107284 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 18:49:21.107414 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.107400 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 18:49:21.107881 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.107862 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 18:49:21.107881 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.107884 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 18:49:21.108005 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.107870 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 18:49:21.108005 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.107942 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 17 18:49:21.108005 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.107948 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 17 18:49:21.108149 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.108056 2571 kubelet_node_status.go:515] "Error getting 
the current node from lister" err="node \"ip-10-0-132-192.ec2.internal\" not found" Apr 17 18:49:21.109204 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.109178 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-192.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 18:49:21.109556 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.109524 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 18:49:21.110118 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.109144 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-192.ec2.internal.18a73976f4f184c5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-192.ec2.internal,UID:ip-10-0-132-192.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-192.ec2.internal,},FirstTimestamp:2026-04-17 18:49:21.099343045 +0000 UTC m=+0.373360328,LastTimestamp:2026-04-17 18:49:21.099343045 +0000 UTC m=+0.373360328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-192.ec2.internal,}" Apr 17 18:49:21.111347 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.111326 2571 factory.go:153] Registering CRI-O factory 
Apr 17 18:49:21.111347 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.111349 2571 factory.go:223] Registration of the crio container factory successfully Apr 17 18:49:21.111491 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.111413 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 18:49:21.111491 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.111426 2571 factory.go:55] Registering systemd factory Apr 17 18:49:21.111491 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.111432 2571 factory.go:223] Registration of the systemd container factory successfully Apr 17 18:49:21.111491 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.111452 2571 factory.go:103] Registering Raw factory Apr 17 18:49:21.111491 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.111481 2571 manager.go:1196] Started watching for new ooms in manager Apr 17 18:49:21.111953 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.111939 2571 manager.go:319] Starting recovery of all containers Apr 17 18:49:21.112175 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.112145 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 18:49:21.121308 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:21.121291 2571 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service/memory.max": read /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service/memory.max: no such device Apr 17 18:49:21.121449 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.121427 2571 manager.go:324] Recovery completed Apr 17 18:49:21.125834 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.125813 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:21.129258 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.129246 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:21.129307 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.129270 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:21.129307 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.129287 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:21.129817 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.129803 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 18:49:21.129864 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.129817 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 18:49:21.129864 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.129848 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 17 18:49:21.132921 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.132909 2571 policy_none.go:49] "None policy: Start" Apr 17 18:49:21.132968 ip-10-0-132-192 kubenswrapper[2571]: 
I0417 18:49:21.132926 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 18:49:21.132968 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.132936 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 17 18:49:21.179808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.174027 2571 manager.go:341] "Starting Device Plugin manager" Apr 17 18:49:21.179808 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.174051 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 18:49:21.179808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.174060 2571 server.go:85] "Starting device plugin registration server" Apr 17 18:49:21.179808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.174233 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 18:49:21.179808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.174242 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 18:49:21.179808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.174341 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 18:49:21.179808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.174428 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 18:49:21.179808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.174436 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 18:49:21.179808 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.174867 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 18:49:21.179808 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.174901 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-192.ec2.internal\" not found" Apr 17 18:49:21.231177 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.231141 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 18:49:21.232311 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.232293 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 18:49:21.232386 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.232317 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 18:49:21.232386 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.232336 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 18:49:21.232386 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.232343 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 18:49:21.232386 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.232372 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 18:49:21.234709 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.234690 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:21.274612 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.274556 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:21.275363 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.275348 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:21.275443 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.275381 2571 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:21.275443 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.275398 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:21.275443 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.275427 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.281059 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.281044 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.281140 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.281064 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-192.ec2.internal\": node \"ip-10-0-132-192.ec2.internal\" not found" Apr 17 18:49:21.297626 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.297603 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-192.ec2.internal\" not found" Apr 17 18:49:21.332972 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.332944 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-132-192.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal"] Apr 17 18:49:21.333104 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.333020 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:21.333958 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.333942 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:21.334045 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.333983 2571 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:21.334045 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.333998 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:21.336118 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.336102 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:21.336247 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.336227 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.336295 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.336260 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:21.336823 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.336808 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:21.336895 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.336812 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:21.336895 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.336855 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:21.336895 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.336866 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:21.336895 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.336835 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 
18:49:21.337021 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.336912 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:21.339021 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.339004 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.339105 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.339030 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:21.339713 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.339691 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:21.339808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.339724 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:21.339808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.339738 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:21.352981 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.352959 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-192.ec2.internal\" not found" node="ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.356588 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.356573 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-192.ec2.internal\" not found" node="ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.398523 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.398505 2571 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"ip-10-0-132-192.ec2.internal\" not found" Apr 17 18:49:21.410154 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.410134 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0286f842e2a4c3c84425f64aac72ff7f-config\") pod \"kube-apiserver-proxy-ip-10-0-132-192.ec2.internal\" (UID: \"0286f842e2a4c3c84425f64aac72ff7f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.410236 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.410164 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b98093225b62e87c02ad86cc72d0d4aa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal\" (UID: \"b98093225b62e87c02ad86cc72d0d4aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.410236 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.410184 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b98093225b62e87c02ad86cc72d0d4aa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal\" (UID: \"b98093225b62e87c02ad86cc72d0d4aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.498584 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.498558 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-192.ec2.internal\" not found" Apr 17 18:49:21.510932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.510909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b98093225b62e87c02ad86cc72d0d4aa-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal\" (UID: \"b98093225b62e87c02ad86cc72d0d4aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.510983 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.510940 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0286f842e2a4c3c84425f64aac72ff7f-config\") pod \"kube-apiserver-proxy-ip-10-0-132-192.ec2.internal\" (UID: \"0286f842e2a4c3c84425f64aac72ff7f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.510983 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.510957 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b98093225b62e87c02ad86cc72d0d4aa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal\" (UID: \"b98093225b62e87c02ad86cc72d0d4aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.511042 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.511008 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b98093225b62e87c02ad86cc72d0d4aa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal\" (UID: \"b98093225b62e87c02ad86cc72d0d4aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.511074 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.511053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0286f842e2a4c3c84425f64aac72ff7f-config\") pod \"kube-apiserver-proxy-ip-10-0-132-192.ec2.internal\" (UID: \"0286f842e2a4c3c84425f64aac72ff7f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.511109 ip-10-0-132-192 
kubenswrapper[2571]: I0417 18:49:21.511077 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b98093225b62e87c02ad86cc72d0d4aa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal\" (UID: \"b98093225b62e87c02ad86cc72d0d4aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.599353 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.599296 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-192.ec2.internal\" not found" Apr 17 18:49:21.654796 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.654775 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.658260 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:21.658244 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" Apr 17 18:49:21.699732 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.699704 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-192.ec2.internal\" not found" Apr 17 18:49:21.800281 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.800245 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-192.ec2.internal\" not found" Apr 17 18:49:21.900746 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:21.900695 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-192.ec2.internal\" not found" Apr 17 18:49:22.001262 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.001240 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-192.ec2.internal\" not found" Apr 17 18:49:22.027713 ip-10-0-132-192 
kubenswrapper[2571]: I0417 18:49:22.027694 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 18:49:22.028203 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.027842 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 18:49:22.036965 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.036948 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:22.088104 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.088076 2571 apiserver.go:52] "Watching apiserver" Apr 17 18:49:22.095854 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.095834 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 18:49:22.097338 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.097315 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-d99zz","openshift-network-operator/iptables-alerter-hscd2","kube-system/konnectivity-agent-rjfcm","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4","openshift-cluster-node-tuning-operator/tuned-qzf4h","openshift-dns/node-resolver-plbd4","openshift-multus/multus-additional-cni-plugins-svmpv","openshift-multus/multus-mcrqw","openshift-network-diagnostics/network-check-target-kx5kn","openshift-ovn-kubernetes/ovnkube-node-sctgd","openshift-image-registry/node-ca-t5hn4"] Apr 17 18:49:22.102905 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.102878 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-hscd2" Apr 17 18:49:22.105840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.105678 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 18:49:22.106035 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.106001 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 18:49:22.106120 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.106106 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gczng\"" Apr 17 18:49:22.106360 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.106345 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:49:22.106435 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.106381 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:22.106512 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.106448 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b" Apr 17 18:49:22.106512 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.106479 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-rjfcm" Apr 17 18:49:22.107577 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.107560 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 18:49:22.107672 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.107580 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-192.ec2.internal" Apr 17 18:49:22.107889 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.107867 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 18:44:21 +0000 UTC" deadline="2027-12-26 13:24:37.920472536 +0000 UTC" Apr 17 18:49:22.107946 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.107888 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14826h35m15.812586837s" Apr 17 18:49:22.108220 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.108204 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 18:49:22.108531 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.108516 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rqpqv\"" Apr 17 18:49:22.108600 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.108530 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 18:49:22.110768 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.110751 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" Apr 17 18:49:22.112635 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.112615 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 18:49:22.112731 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.112714 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-74dfs\"" Apr 17 18:49:22.112789 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.112717 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 18:49:22.112844 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.112792 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 18:49:22.112844 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.112820 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.112947 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.112931 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-plbd4" Apr 17 18:49:22.113860 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.113844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfjwg\" (UniqueName: \"kubernetes.io/projected/9cb68ed8-ce9b-48b8-9980-07d87baf968b-kube-api-access-cfjwg\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:22.113914 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.113871 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/396ffb22-0675-49a2-99f6-018d18d53fe3-agent-certs\") pod \"konnectivity-agent-rjfcm\" (UID: \"396ffb22-0675-49a2-99f6-018d18d53fe3\") " pod="kube-system/konnectivity-agent-rjfcm" Apr 17 18:49:22.113914 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.113905 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-etc-selinux\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" Apr 17 18:49:22.113978 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.113960 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-sys-fs\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" Apr 17 18:49:22.114018 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.113986 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:22.114018 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvxqm\" (UniqueName: \"kubernetes.io/projected/91b8b586-8905-45d3-8ed4-577118981cd3-kube-api-access-tvxqm\") pod \"iptables-alerter-hscd2\" (UID: \"91b8b586-8905-45d3-8ed4-577118981cd3\") " pod="openshift-network-operator/iptables-alerter-hscd2" Apr 17 18:49:22.114080 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114030 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/91b8b586-8905-45d3-8ed4-577118981cd3-iptables-alerter-script\") pod \"iptables-alerter-hscd2\" (UID: \"91b8b586-8905-45d3-8ed4-577118981cd3\") " pod="openshift-network-operator/iptables-alerter-hscd2" Apr 17 18:49:22.114080 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114045 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91b8b586-8905-45d3-8ed4-577118981cd3-host-slash\") pod \"iptables-alerter-hscd2\" (UID: \"91b8b586-8905-45d3-8ed4-577118981cd3\") " pod="openshift-network-operator/iptables-alerter-hscd2" Apr 17 18:49:22.114080 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114061 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" Apr 17 18:49:22.114080 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114076 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-socket-dir\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" Apr 17 18:49:22.114198 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114091 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/396ffb22-0675-49a2-99f6-018d18d53fe3-konnectivity-ca\") pod \"konnectivity-agent-rjfcm\" (UID: \"396ffb22-0675-49a2-99f6-018d18d53fe3\") " pod="kube-system/konnectivity-agent-rjfcm" Apr 17 18:49:22.114198 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114105 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-registration-dir\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" Apr 17 18:49:22.114198 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114119 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drd5n\" (UniqueName: \"kubernetes.io/projected/696b7876-308e-4809-8fe2-ebe856ea41b9-kube-api-access-drd5n\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" Apr 17 18:49:22.114198 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114134 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-device-dir\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" Apr 17 18:49:22.114695 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114676 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xlxz2\"" Apr 17 18:49:22.114805 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114703 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-b6ngv\"" Apr 17 18:49:22.114805 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114679 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:49:22.114805 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114680 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 18:49:22.114993 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114818 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 18:49:22.114993 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.114988 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 18:49:22.115274 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.115260 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.116735 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.116717 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 18:49:22.116815 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.116788 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" Apr 17 18:49:22.117679 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.117421 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.117679 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.117487 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 18:49:22.117815 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.117767 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 18:49:22.117815 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.117782 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 18:49:22.117920 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.117841 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dn6ct\"" Apr 17 18:49:22.118085 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.117982 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 18:49:22.118085 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.118034 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 18:49:22.119691 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.119672 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8mcwb\"" Apr 17 18:49:22.119770 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.119699 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:22.119819 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.119773 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943" Apr 17 18:49:22.119819 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.119806 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 18:49:22.121208 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.121190 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 18:49:22.122209 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.122190 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.124063 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.124047 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 18:49:22.124298 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.124283 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 18:49:22.124345 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.124307 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-132-192.ec2.internal"] Apr 17 18:49:22.124422 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.124406 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-t5hn4" Apr 17 18:49:22.125217 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.124682 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 18:49:22.125217 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.125009 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sfft2\"" Apr 17 18:49:22.125217 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.125089 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 18:49:22.125217 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.125090 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 18:49:22.125474 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.125435 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 18:49:22.125937 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.125921 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 18:49:22.126349 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.126147 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal"] Apr 17 18:49:22.127759 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.127735 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 18:49:22.127851 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.127778 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 18:49:22.127909 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.127895 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 18:49:22.128490 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.128474 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2vzsf\"" Apr 17 18:49:22.135698 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.135681 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9cr59" Apr 17 18:49:22.142989 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.142973 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9cr59" Apr 17 18:49:22.166286 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:22.166258 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb98093225b62e87c02ad86cc72d0d4aa.slice/crio-4c96cc5a44954bed221ad26dace619f89b350c5eb804d47939188e7d54febab7 WatchSource:0}: Error finding container 4c96cc5a44954bed221ad26dace619f89b350c5eb804d47939188e7d54febab7: Status 404 returned error can't find the container with id 4c96cc5a44954bed221ad26dace619f89b350c5eb804d47939188e7d54febab7 Apr 17 18:49:22.166596 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:22.166580 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0286f842e2a4c3c84425f64aac72ff7f.slice/crio-9c6467a961cd1bab1726ada5b141333abad38f8fbe49d0cde869658fe767c75d WatchSource:0}: Error finding container 9c6467a961cd1bab1726ada5b141333abad38f8fbe49d0cde869658fe767c75d: Status 404 returned error can't find the container with id 9c6467a961cd1bab1726ada5b141333abad38f8fbe49d0cde869658fe767c75d Apr 17 18:49:22.172967 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.172947 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:49:22.208692 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.208673 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 18:49:22.214342 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214319 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58b4g\" (UniqueName: \"kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g\") pod \"network-check-target-kx5kn\" (UID: \"a96394bb-18f6-42cc-975e-c0532c5c2943\") " pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:22.214440 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214353 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/976291a6-4419-41fa-a907-55cd1103cf75-tmp\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.214440 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214378 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcqft\" (UniqueName: \"kubernetes.io/projected/3d42e5aa-a588-4ce1-a264-4581a72945cb-kube-api-access-pcqft\") pod \"node-ca-t5hn4\" (UID: \"3d42e5aa-a588-4ce1-a264-4581a72945cb\") " pod="openshift-image-registry/node-ca-t5hn4" Apr 17 18:49:22.214440 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214425 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-hostroot\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.214622 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214448 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ccccfa46-ab60-4610-8d60-6fad3773eedb-ovnkube-config\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.214622 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214495 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/396ffb22-0675-49a2-99f6-018d18d53fe3-konnectivity-ca\") pod \"konnectivity-agent-rjfcm\" (UID: \"396ffb22-0675-49a2-99f6-018d18d53fe3\") " pod="kube-system/konnectivity-agent-rjfcm" Apr 17 18:49:22.214622 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214521 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-drd5n\" (UniqueName: \"kubernetes.io/projected/696b7876-308e-4809-8fe2-ebe856ea41b9-kube-api-access-drd5n\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" Apr 17 18:49:22.214622 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214548 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/970c6428-01b9-4aa1-be61-fc714b218008-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.214622 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214577 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-cni-binary-copy\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.214622 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-run-netns\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.214849 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214625 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-multus-socket-dir-parent\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " 
pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.214849 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214653 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-etc-selinux\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" Apr 17 18:49:22.214849 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214678 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qvll\" (UniqueName: \"kubernetes.io/projected/8210137d-ed94-434c-897d-f67481261a39-kube-api-access-8qvll\") pod \"node-resolver-plbd4\" (UID: \"8210137d-ed94-434c-897d-f67481261a39\") " pod="openshift-dns/node-resolver-plbd4" Apr 17 18:49:22.214849 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214702 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-multus-cni-dir\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.214849 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214753 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-os-release\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.214849 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214772 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-etc-selinux\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: 
\"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4"
Apr 17 18:49:22.214849 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214792 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-run-systemd\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.214849 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214813 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-modprobe-d\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.214849 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214842 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-sysctl-conf\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214859 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-var-lib-kubelet\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214874 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-etc-openvswitch\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214890 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214904 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ccccfa46-ab60-4610-8d60-6fad3773eedb-ovn-node-metrics-cert\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214924 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91b8b586-8905-45d3-8ed4-577118981cd3-host-slash\") pod \"iptables-alerter-hscd2\" (UID: \"91b8b586-8905-45d3-8ed4-577118981cd3\") " pod="openshift-network-operator/iptables-alerter-hscd2"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214941 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214948 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/396ffb22-0675-49a2-99f6-018d18d53fe3-konnectivity-ca\") pod \"konnectivity-agent-rjfcm\" (UID: \"396ffb22-0675-49a2-99f6-018d18d53fe3\") " pod="kube-system/konnectivity-agent-rjfcm"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214962 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-socket-dir\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214983 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3d42e5aa-a588-4ce1-a264-4581a72945cb-serviceca\") pod \"node-ca-t5hn4\" (UID: \"3d42e5aa-a588-4ce1-a264-4581a72945cb\") " pod="openshift-image-registry/node-ca-t5hn4"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.214991 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91b8b586-8905-45d3-8ed4-577118981cd3-host-slash\") pod \"iptables-alerter-hscd2\" (UID: \"91b8b586-8905-45d3-8ed4-577118981cd3\") " pod="openshift-network-operator/iptables-alerter-hscd2"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215005 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bshf5\" (UniqueName: \"kubernetes.io/projected/970c6428-01b9-4aa1-be61-fc714b218008-kube-api-access-bshf5\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215033 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215049 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-run-k8s-cni-cncf-io\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215075 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-var-lib-cni-multus\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215096 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-cni-netd\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.215168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215096 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-socket-dir\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215119 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-sysctl-d\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215134 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jzdk\" (UniqueName: \"kubernetes.io/projected/976291a6-4419-41fa-a907-55cd1103cf75-kube-api-access-8jzdk\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215153 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/970c6428-01b9-4aa1-be61-fc714b218008-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215183 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-kubelet\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215207 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46vpj\" (UniqueName: \"kubernetes.io/projected/ccccfa46-ab60-4610-8d60-6fad3773eedb-kube-api-access-46vpj\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215234 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-device-dir\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215272 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-kubernetes\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215279 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-device-dir\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215311 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/970c6428-01b9-4aa1-be61-fc714b218008-system-cni-dir\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215334 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-run-netns\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215351 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-slash\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215365 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-run-ovn\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215380 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ccccfa46-ab60-4610-8d60-6fad3773eedb-env-overrides\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215407 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvxqm\" (UniqueName: \"kubernetes.io/projected/91b8b586-8905-45d3-8ed4-577118981cd3-kube-api-access-tvxqm\") pod \"iptables-alerter-hscd2\" (UID: \"91b8b586-8905-45d3-8ed4-577118981cd3\") " pod="openshift-network-operator/iptables-alerter-hscd2"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215424 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-run\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.215717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215438 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8210137d-ed94-434c-897d-f67481261a39-hosts-file\") pod \"node-resolver-plbd4\" (UID: \"8210137d-ed94-434c-897d-f67481261a39\") " pod="openshift-dns/node-resolver-plbd4"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215452 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-log-socket\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215512 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/970c6428-01b9-4aa1-be61-fc714b218008-os-release\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215527 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-cnibin\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215541 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-var-lib-cni-bin\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215564 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-multus-daemon-config\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215598 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-node-log\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-run-ovn-kubernetes\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215666 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfjwg\" (UniqueName: \"kubernetes.io/projected/9cb68ed8-ce9b-48b8-9980-07d87baf968b-kube-api-access-cfjwg\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215681 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/396ffb22-0675-49a2-99f6-018d18d53fe3-agent-certs\") pod \"konnectivity-agent-rjfcm\" (UID: \"396ffb22-0675-49a2-99f6-018d18d53fe3\") " pod="kube-system/konnectivity-agent-rjfcm"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215696 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-sys-fs\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/970c6428-01b9-4aa1-be61-fc714b218008-cnibin\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215727 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-var-lib-openvswitch\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215742 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-cni-bin\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215776 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-var-lib-kubelet\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215805 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8210137d-ed94-434c-897d-f67481261a39-tmp-dir\") pod \"node-resolver-plbd4\" (UID: \"8210137d-ed94-434c-897d-f67481261a39\") " pod="openshift-dns/node-resolver-plbd4"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215781 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-sys-fs\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4"
Apr 17 18:49:22.216404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215827 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-etc-kubernetes\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215873 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-systemd-units\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215895 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ccccfa46-ab60-4610-8d60-6fad3773eedb-ovnkube-script-lib\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215913 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/91b8b586-8905-45d3-8ed4-577118981cd3-iptables-alerter-script\") pod \"iptables-alerter-hscd2\" (UID: \"91b8b586-8905-45d3-8ed4-577118981cd3\") " pod="openshift-network-operator/iptables-alerter-hscd2"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215931 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/976291a6-4419-41fa-a907-55cd1103cf75-etc-tuned\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215945 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-system-cni-dir\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-registration-dir\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.215988 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/970c6428-01b9-4aa1-be61-fc714b218008-cni-binary-copy\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216004 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216014 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-run-multus-certs\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216029 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/696b7876-308e-4809-8fe2-ebe856ea41b9-registration-dir\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216038 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcmmh\" (UniqueName: \"kubernetes.io/projected/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-kube-api-access-xcmmh\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216064 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-run-openvswitch\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216096 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-sys\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216129 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-lib-modules\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216165 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-systemd\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216189 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/970c6428-01b9-4aa1-be61-fc714b218008-tuning-conf-dir\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv"
Apr 17 18:49:22.216932 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216219 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-multus-conf-dir\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.217424 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216242 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz"
Apr 17 18:49:22.217424 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216273 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-sysconfig\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.217424 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.216332 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:49:22.217424 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216356 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-host\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.217424 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216376 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d42e5aa-a588-4ce1-a264-4581a72945cb-host\") pod \"node-ca-t5hn4\" (UID: \"3d42e5aa-a588-4ce1-a264-4581a72945cb\") " pod="openshift-image-registry/node-ca-t5hn4"
Apr 17 18:49:22.217424 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.216399 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs podName:9cb68ed8-ce9b-48b8-9980-07d87baf968b nodeName:}" failed. No retries permitted until 2026-04-17 18:49:22.716383017 +0000 UTC m=+1.990400315 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs") pod "network-metrics-daemon-d99zz" (UID: "9cb68ed8-ce9b-48b8-9980-07d87baf968b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:49:22.217424 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.216404 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/91b8b586-8905-45d3-8ed4-577118981cd3-iptables-alerter-script\") pod \"iptables-alerter-hscd2\" (UID: \"91b8b586-8905-45d3-8ed4-577118981cd3\") " pod="openshift-network-operator/iptables-alerter-hscd2"
Apr 17 18:49:22.218897 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.218881 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/396ffb22-0675-49a2-99f6-018d18d53fe3-agent-certs\") pod \"konnectivity-agent-rjfcm\" (UID: \"396ffb22-0675-49a2-99f6-018d18d53fe3\") " pod="kube-system/konnectivity-agent-rjfcm"
Apr 17 18:49:22.221977 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.221959 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drd5n\" (UniqueName: \"kubernetes.io/projected/696b7876-308e-4809-8fe2-ebe856ea41b9-kube-api-access-drd5n\") pod \"aws-ebs-csi-driver-node-vmqn4\" (UID: \"696b7876-308e-4809-8fe2-ebe856ea41b9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4"
Apr 17 18:49:22.225351 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.225333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvxqm\" (UniqueName: \"kubernetes.io/projected/91b8b586-8905-45d3-8ed4-577118981cd3-kube-api-access-tvxqm\") pod \"iptables-alerter-hscd2\" (UID: \"91b8b586-8905-45d3-8ed4-577118981cd3\") " pod="openshift-network-operator/iptables-alerter-hscd2"
Apr 17 18:49:22.226351 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.226332 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfjwg\" (UniqueName: \"kubernetes.io/projected/9cb68ed8-ce9b-48b8-9980-07d87baf968b-kube-api-access-cfjwg\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz"
Apr 17 18:49:22.234808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.234763 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-192.ec2.internal" event={"ID":"0286f842e2a4c3c84425f64aac72ff7f","Type":"ContainerStarted","Data":"9c6467a961cd1bab1726ada5b141333abad38f8fbe49d0cde869658fe767c75d"}
Apr 17 18:49:22.235646 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.235625 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" event={"ID":"b98093225b62e87c02ad86cc72d0d4aa","Type":"ContainerStarted","Data":"4c96cc5a44954bed221ad26dace619f89b350c5eb804d47939188e7d54febab7"}
Apr 17 18:49:22.316688 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316663 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-etc-openvswitch\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.316688 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316690 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.316874 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316707 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ccccfa46-ab60-4610-8d60-6fad3773eedb-ovn-node-metrics-cert\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.316874 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316724 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3d42e5aa-a588-4ce1-a264-4581a72945cb-serviceca\") pod \"node-ca-t5hn4\" (UID: \"3d42e5aa-a588-4ce1-a264-4581a72945cb\") " pod="openshift-image-registry/node-ca-t5hn4"
Apr 17 18:49:22.316874 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316740 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bshf5\" (UniqueName: \"kubernetes.io/projected/970c6428-01b9-4aa1-be61-fc714b218008-kube-api-access-bshf5\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv"
Apr 17 18:49:22.316874 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316764 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-run-k8s-cni-cncf-io\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.316874 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-var-lib-cni-multus\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.316874 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316761 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-etc-openvswitch\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.316874 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316804 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-cni-netd\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.316874 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316814 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.316874 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316837 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-run-k8s-cni-cncf-io\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.316874 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316844 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-cni-netd\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:22.316874 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-sysctl-d\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316880 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-var-lib-cni-multus\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw"
Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316896 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jzdk\" (UniqueName: \"kubernetes.io/projected/976291a6-4419-41fa-a907-55cd1103cf75-kube-api-access-8jzdk\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h"
Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/970c6428-01b9-4aa1-be61-fc714b218008-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv"
Apr 17 18:49:22.317364
ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.316945 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-kubelet\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317001 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46vpj\" (UniqueName: \"kubernetes.io/projected/ccccfa46-ab60-4610-8d60-6fad3773eedb-kube-api-access-46vpj\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317029 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-kubernetes\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317051 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-kubelet\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/970c6428-01b9-4aa1-be61-fc714b218008-system-cni-dir\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " 
pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317028 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-sysctl-d\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317094 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-run-netns\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317113 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-kubernetes\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317132 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-slash\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317157 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-run-ovn\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317160 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-run-netns\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317199 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ccccfa46-ab60-4610-8d60-6fad3773eedb-env-overrides\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317215 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-slash\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317215 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3d42e5aa-a588-4ce1-a264-4581a72945cb-serviceca\") pod \"node-ca-t5hn4\" (UID: \"3d42e5aa-a588-4ce1-a264-4581a72945cb\") " pod="openshift-image-registry/node-ca-t5hn4" Apr 17 18:49:22.317364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317232 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-run-ovn\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318031 
ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317227 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-run\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317264 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/970c6428-01b9-4aa1-be61-fc714b218008-system-cni-dir\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317278 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8210137d-ed94-434c-897d-f67481261a39-hosts-file\") pod \"node-resolver-plbd4\" (UID: \"8210137d-ed94-434c-897d-f67481261a39\") " pod="openshift-dns/node-resolver-plbd4" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317303 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-log-socket\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317312 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-run\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 
18:49:22.317334 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/970c6428-01b9-4aa1-be61-fc714b218008-os-release\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317353 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-log-socket\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317359 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-cnibin\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8210137d-ed94-434c-897d-f67481261a39-hosts-file\") pod \"node-resolver-plbd4\" (UID: \"8210137d-ed94-434c-897d-f67481261a39\") " pod="openshift-dns/node-resolver-plbd4" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317413 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/970c6428-01b9-4aa1-be61-fc714b218008-os-release\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317418 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-cnibin\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317443 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-var-lib-cni-bin\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317489 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-multus-daemon-config\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317512 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-node-log\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317519 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-var-lib-cni-bin\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317535 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-run-ovn-kubernetes\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317550 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/970c6428-01b9-4aa1-be61-fc714b218008-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.318031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317565 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/970c6428-01b9-4aa1-be61-fc714b218008-cnibin\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317590 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-var-lib-openvswitch\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317599 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-run-ovn-kubernetes\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: 
I0417 18:49:22.317567 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-node-log\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317615 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-cni-bin\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317617 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/970c6428-01b9-4aa1-be61-fc714b218008-cnibin\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317644 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-var-lib-openvswitch\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317656 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-var-lib-kubelet\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: 
I0417 18:49:22.317685 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8210137d-ed94-434c-897d-f67481261a39-tmp-dir\") pod \"node-resolver-plbd4\" (UID: \"8210137d-ed94-434c-897d-f67481261a39\") " pod="openshift-dns/node-resolver-plbd4" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317659 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-cni-bin\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317728 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-etc-kubernetes\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317722 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-var-lib-kubelet\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317772 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-etc-kubernetes\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317810 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-systemd-units\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317836 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ccccfa46-ab60-4610-8d60-6fad3773eedb-ovnkube-script-lib\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317846 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-systemd-units\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/976291a6-4419-41fa-a907-55cd1103cf75-etc-tuned\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-system-cni-dir\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.318716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317911 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8210137d-ed94-434c-897d-f67481261a39-tmp-dir\") pod \"node-resolver-plbd4\" (UID: \"8210137d-ed94-434c-897d-f67481261a39\") " pod="openshift-dns/node-resolver-plbd4" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/970c6428-01b9-4aa1-be61-fc714b218008-cni-binary-copy\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317939 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-run-multus-certs\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317964 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcmmh\" (UniqueName: \"kubernetes.io/projected/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-kube-api-access-xcmmh\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317970 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-system-cni-dir\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.317990 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-run-openvswitch\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318013 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-run-multus-certs\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318016 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-sys\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318042 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-lib-modules\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318068 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-systemd\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318096 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/970c6428-01b9-4aa1-be61-fc714b218008-tuning-conf-dir\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318120 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-multus-conf-dir\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318159 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-sysconfig\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318205 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-host\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318229 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d42e5aa-a588-4ce1-a264-4581a72945cb-host\") pod \"node-ca-t5hn4\" (UID: \"3d42e5aa-a588-4ce1-a264-4581a72945cb\") " pod="openshift-image-registry/node-ca-t5hn4" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318257 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-58b4g\" (UniqueName: \"kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g\") pod \"network-check-target-kx5kn\" (UID: \"a96394bb-18f6-42cc-975e-c0532c5c2943\") " pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318262 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ccccfa46-ab60-4610-8d60-6fad3773eedb-env-overrides\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318282 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/976291a6-4419-41fa-a907-55cd1103cf75-tmp\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.319507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318295 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-systemd\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcqft\" (UniqueName: \"kubernetes.io/projected/3d42e5aa-a588-4ce1-a264-4581a72945cb-kube-api-access-pcqft\") pod \"node-ca-t5hn4\" (UID: \"3d42e5aa-a588-4ce1-a264-4581a72945cb\") " pod="openshift-image-registry/node-ca-t5hn4" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318334 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-sys\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318342 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-hostroot\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ccccfa46-ab60-4610-8d60-6fad3773eedb-ovnkube-config\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318372 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-run-openvswitch\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318385 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ccccfa46-ab60-4610-8d60-6fad3773eedb-ovnkube-script-lib\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318397 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" 
(UniqueName: \"kubernetes.io/configmap/970c6428-01b9-4aa1-be61-fc714b218008-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318419 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-host\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318425 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-cni-binary-copy\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318418 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/970c6428-01b9-4aa1-be61-fc714b218008-cni-binary-copy\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318452 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/970c6428-01b9-4aa1-be61-fc714b218008-tuning-conf-dir\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318489 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-hostroot\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318511 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-sysconfig\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318534 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d42e5aa-a588-4ce1-a264-4581a72945cb-host\") pod \"node-ca-t5hn4\" (UID: \"3d42e5aa-a588-4ce1-a264-4581a72945cb\") " pod="openshift-image-registry/node-ca-t5hn4" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318615 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-run-netns\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318652 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-multus-socket-dir-parent\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318682 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8qvll\" (UniqueName: \"kubernetes.io/projected/8210137d-ed94-434c-897d-f67481261a39-kube-api-access-8qvll\") pod \"node-resolver-plbd4\" (UID: \"8210137d-ed94-434c-897d-f67481261a39\") " pod="openshift-dns/node-resolver-plbd4" Apr 17 18:49:22.320353 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318706 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-multus-cni-dir\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318733 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-os-release\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318759 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-run-systemd\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318784 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-modprobe-d\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-sysctl-conf\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-var-lib-kubelet\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318898 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-host-var-lib-kubelet\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318954 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ccccfa46-ab60-4610-8d60-6fad3773eedb-ovnkube-config\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318958 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-host-run-netns\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318991 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-multus-cni-dir\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.319002 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-multus-socket-dir-parent\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.318760 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-lib-modules\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.319047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ccccfa46-ab60-4610-8d60-6fad3773eedb-run-systemd\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.319130 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-modprobe-d\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.319155 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/976291a6-4419-41fa-a907-55cd1103cf75-etc-sysctl-conf\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.319197 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-multus-conf-dir\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.319249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-os-release\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.319338 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ccccfa46-ab60-4610-8d60-6fad3773eedb-ovn-node-metrics-cert\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.320960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.319398 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/970c6428-01b9-4aa1-be61-fc714b218008-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.321436 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.319669 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-multus-daemon-config\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.321436 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.319735 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-cni-binary-copy\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.321436 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.320117 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/976291a6-4419-41fa-a907-55cd1103cf75-etc-tuned\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.321436 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.320750 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/976291a6-4419-41fa-a907-55cd1103cf75-tmp\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.323655 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.323634 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:22.323737 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.323658 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:22.323737 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.323673 2571 projected.go:194] Error preparing data for projected 
volume kube-api-access-58b4g for pod openshift-network-diagnostics/network-check-target-kx5kn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:22.323737 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.323728 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g podName:a96394bb-18f6-42cc-975e-c0532c5c2943 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:22.823710421 +0000 UTC m=+2.097727705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-58b4g" (UniqueName: "kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g") pod "network-check-target-kx5kn" (UID: "a96394bb-18f6-42cc-975e-c0532c5c2943") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:22.325680 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.325661 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46vpj\" (UniqueName: \"kubernetes.io/projected/ccccfa46-ab60-4610-8d60-6fad3773eedb-kube-api-access-46vpj\") pod \"ovnkube-node-sctgd\" (UID: \"ccccfa46-ab60-4610-8d60-6fad3773eedb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.325893 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.325873 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bshf5\" (UniqueName: \"kubernetes.io/projected/970c6428-01b9-4aa1-be61-fc714b218008-kube-api-access-bshf5\") pod \"multus-additional-cni-plugins-svmpv\" (UID: \"970c6428-01b9-4aa1-be61-fc714b218008\") " pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.326049 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.326033 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jzdk\" (UniqueName: \"kubernetes.io/projected/976291a6-4419-41fa-a907-55cd1103cf75-kube-api-access-8jzdk\") pod \"tuned-qzf4h\" (UID: \"976291a6-4419-41fa-a907-55cd1103cf75\") " pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.326224 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.326209 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcqft\" (UniqueName: \"kubernetes.io/projected/3d42e5aa-a588-4ce1-a264-4581a72945cb-kube-api-access-pcqft\") pod \"node-ca-t5hn4\" (UID: \"3d42e5aa-a588-4ce1-a264-4581a72945cb\") " pod="openshift-image-registry/node-ca-t5hn4" Apr 17 18:49:22.326302 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.326287 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcmmh\" (UniqueName: \"kubernetes.io/projected/d04b9d8e-1775-4cd8-ac25-94161e15b4ee-kube-api-access-xcmmh\") pod \"multus-mcrqw\" (UID: \"d04b9d8e-1775-4cd8-ac25-94161e15b4ee\") " pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.326381 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.326366 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qvll\" (UniqueName: \"kubernetes.io/projected/8210137d-ed94-434c-897d-f67481261a39-kube-api-access-8qvll\") pod \"node-resolver-plbd4\" (UID: \"8210137d-ed94-434c-897d-f67481261a39\") " pod="openshift-dns/node-resolver-plbd4" Apr 17 18:49:22.406514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.406419 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:22.430675 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.430653 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-hscd2" Apr 17 18:49:22.436746 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:22.436722 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b8b586_8905_45d3_8ed4_577118981cd3.slice/crio-485a6e3ae371607ea61597fd65e8b7f14d1cfb8f4a7e0e15dd2a26b42631b116 WatchSource:0}: Error finding container 485a6e3ae371607ea61597fd65e8b7f14d1cfb8f4a7e0e15dd2a26b42631b116: Status 404 returned error can't find the container with id 485a6e3ae371607ea61597fd65e8b7f14d1cfb8f4a7e0e15dd2a26b42631b116 Apr 17 18:49:22.449552 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.449535 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rjfcm" Apr 17 18:49:22.455600 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:22.455581 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod396ffb22_0675_49a2_99f6_018d18d53fe3.slice/crio-eb698e67b6f8700588e29f9196e3a8a01a732c50a72c5bda11895fdd8c4b3963 WatchSource:0}: Error finding container eb698e67b6f8700588e29f9196e3a8a01a732c50a72c5bda11895fdd8c4b3963: Status 404 returned error can't find the container with id eb698e67b6f8700588e29f9196e3a8a01a732c50a72c5bda11895fdd8c4b3963 Apr 17 18:49:22.475470 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.475429 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" Apr 17 18:49:22.479045 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.479022 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" Apr 17 18:49:22.480994 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:22.480972 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod696b7876_308e_4809_8fe2_ebe856ea41b9.slice/crio-37a79bba6485a8c0860c92279d136188339d2b3318d62cfc45cf6dd8e51ab92e WatchSource:0}: Error finding container 37a79bba6485a8c0860c92279d136188339d2b3318d62cfc45cf6dd8e51ab92e: Status 404 returned error can't find the container with id 37a79bba6485a8c0860c92279d136188339d2b3318d62cfc45cf6dd8e51ab92e Apr 17 18:49:22.485233 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.485145 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-plbd4" Apr 17 18:49:22.485569 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.485550 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:22.487470 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:22.487439 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod976291a6_4419_41fa_a907_55cd1103cf75.slice/crio-9ad4407b58100234ed033f2c5aa4d6a8810c4af76ed4602a9723fbeca9e12c5e WatchSource:0}: Error finding container 9ad4407b58100234ed033f2c5aa4d6a8810c4af76ed4602a9723fbeca9e12c5e: Status 404 returned error can't find the container with id 9ad4407b58100234ed033f2c5aa4d6a8810c4af76ed4602a9723fbeca9e12c5e Apr 17 18:49:22.491169 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.491147 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-svmpv" Apr 17 18:49:22.494854 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:22.494831 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8210137d_ed94_434c_897d_f67481261a39.slice/crio-a64324e8ff1f051e3a9dc5175a3f98340689e55e35e09929005f7cf0bfa4709b WatchSource:0}: Error finding container a64324e8ff1f051e3a9dc5175a3f98340689e55e35e09929005f7cf0bfa4709b: Status 404 returned error can't find the container with id a64324e8ff1f051e3a9dc5175a3f98340689e55e35e09929005f7cf0bfa4709b Apr 17 18:49:22.495483 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.495450 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mcrqw" Apr 17 18:49:22.500194 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:22.500171 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod970c6428_01b9_4aa1_be61_fc714b218008.slice/crio-b7d53429656ef4680f3861e277cde04df7103fd91865c782b4655a42b26754ba WatchSource:0}: Error finding container b7d53429656ef4680f3861e277cde04df7103fd91865c782b4655a42b26754ba: Status 404 returned error can't find the container with id b7d53429656ef4680f3861e277cde04df7103fd91865c782b4655a42b26754ba Apr 17 18:49:22.500975 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.500956 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" Apr 17 18:49:22.503355 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:22.503329 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd04b9d8e_1775_4cd8_ac25_94161e15b4ee.slice/crio-964cebb2b17a7d89a01e38c9c3a5b51eb666a27c590b9795363a8d96fb3ff09c WatchSource:0}: Error finding container 964cebb2b17a7d89a01e38c9c3a5b51eb666a27c590b9795363a8d96fb3ff09c: Status 404 returned error can't find the container with id 964cebb2b17a7d89a01e38c9c3a5b51eb666a27c590b9795363a8d96fb3ff09c Apr 17 18:49:22.505363 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.505345 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-t5hn4" Apr 17 18:49:22.510472 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:22.510439 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccccfa46_ab60_4610_8d60_6fad3773eedb.slice/crio-8ad96016f7fecf8b6fd6a1589337515e25330b94ec83734d230a44a36e0c25d4 WatchSource:0}: Error finding container 8ad96016f7fecf8b6fd6a1589337515e25330b94ec83734d230a44a36e0c25d4: Status 404 returned error can't find the container with id 8ad96016f7fecf8b6fd6a1589337515e25330b94ec83734d230a44a36e0c25d4 Apr 17 18:49:22.513857 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:22.513838 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d42e5aa_a588_4ce1_a264_4581a72945cb.slice/crio-f8ef39d8b9dc57c4a26dafa45abef73c4a5bbe632c4c534bcdc145c29cdc177f WatchSource:0}: Error finding container f8ef39d8b9dc57c4a26dafa45abef73c4a5bbe632c4c534bcdc145c29cdc177f: Status 404 returned error can't find the container with id f8ef39d8b9dc57c4a26dafa45abef73c4a5bbe632c4c534bcdc145c29cdc177f Apr 17 18:49:22.722961 ip-10-0-132-192 
kubenswrapper[2571]: I0417 18:49:22.722306 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:22.722961 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.722505 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:22.722961 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.722574 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs podName:9cb68ed8-ce9b-48b8-9980-07d87baf968b nodeName:}" failed. No retries permitted until 2026-04-17 18:49:23.722554289 +0000 UTC m=+2.996571561 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs") pod "network-metrics-daemon-d99zz" (UID: "9cb68ed8-ce9b-48b8-9980-07d87baf968b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:22.923663 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:22.923626 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58b4g\" (UniqueName: \"kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g\") pod \"network-check-target-kx5kn\" (UID: \"a96394bb-18f6-42cc-975e-c0532c5c2943\") " pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:22.923817 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.923803 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:22.923892 
ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.923829 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:22.923892 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.923844 2571 projected.go:194] Error preparing data for projected volume kube-api-access-58b4g for pod openshift-network-diagnostics/network-check-target-kx5kn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:22.924000 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:22.923912 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g podName:a96394bb-18f6-42cc-975e-c0532c5c2943 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:23.923889574 +0000 UTC m=+3.197906845 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-58b4g" (UniqueName: "kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g") pod "network-check-target-kx5kn" (UID: "a96394bb-18f6-42cc-975e-c0532c5c2943") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:23.144414 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.144328 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 18:44:22 +0000 UTC" deadline="2028-01-13 22:23:50.073613692 +0000 UTC" Apr 17 18:49:23.144414 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.144368 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15267h34m26.92924996s" Apr 17 18:49:23.181724 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.181693 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:23.247740 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.247704 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t5hn4" event={"ID":"3d42e5aa-a588-4ce1-a264-4581a72945cb","Type":"ContainerStarted","Data":"f8ef39d8b9dc57c4a26dafa45abef73c4a5bbe632c4c534bcdc145c29cdc177f"} Apr 17 18:49:23.262100 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.262031 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" event={"ID":"ccccfa46-ab60-4610-8d60-6fad3773eedb","Type":"ContainerStarted","Data":"8ad96016f7fecf8b6fd6a1589337515e25330b94ec83734d230a44a36e0c25d4"} Apr 17 18:49:23.269918 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.269889 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-svmpv" 
event={"ID":"970c6428-01b9-4aa1-be61-fc714b218008","Type":"ContainerStarted","Data":"b7d53429656ef4680f3861e277cde04df7103fd91865c782b4655a42b26754ba"} Apr 17 18:49:23.274716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.274645 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rjfcm" event={"ID":"396ffb22-0675-49a2-99f6-018d18d53fe3","Type":"ContainerStarted","Data":"eb698e67b6f8700588e29f9196e3a8a01a732c50a72c5bda11895fdd8c4b3963"} Apr 17 18:49:23.286558 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.286508 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hscd2" event={"ID":"91b8b586-8905-45d3-8ed4-577118981cd3","Type":"ContainerStarted","Data":"485a6e3ae371607ea61597fd65e8b7f14d1cfb8f4a7e0e15dd2a26b42631b116"} Apr 17 18:49:23.290094 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.290043 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcrqw" event={"ID":"d04b9d8e-1775-4cd8-ac25-94161e15b4ee","Type":"ContainerStarted","Data":"964cebb2b17a7d89a01e38c9c3a5b51eb666a27c590b9795363a8d96fb3ff09c"} Apr 17 18:49:23.293817 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.293763 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-plbd4" event={"ID":"8210137d-ed94-434c-897d-f67481261a39","Type":"ContainerStarted","Data":"a64324e8ff1f051e3a9dc5175a3f98340689e55e35e09929005f7cf0bfa4709b"} Apr 17 18:49:23.300603 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.300546 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" event={"ID":"976291a6-4419-41fa-a907-55cd1103cf75","Type":"ContainerStarted","Data":"9ad4407b58100234ed033f2c5aa4d6a8810c4af76ed4602a9723fbeca9e12c5e"} Apr 17 18:49:23.308602 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.308571 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" event={"ID":"696b7876-308e-4809-8fe2-ebe856ea41b9","Type":"ContainerStarted","Data":"37a79bba6485a8c0860c92279d136188339d2b3318d62cfc45cf6dd8e51ab92e"} Apr 17 18:49:23.729996 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.729958 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:23.730180 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:23.730132 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:23.730242 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:23.730206 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs podName:9cb68ed8-ce9b-48b8-9980-07d87baf968b nodeName:}" failed. No retries permitted until 2026-04-17 18:49:25.730185595 +0000 UTC m=+5.004202888 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs") pod "network-metrics-daemon-d99zz" (UID: "9cb68ed8-ce9b-48b8-9980-07d87baf968b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:23.932578 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:23.931924 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58b4g\" (UniqueName: \"kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g\") pod \"network-check-target-kx5kn\" (UID: \"a96394bb-18f6-42cc-975e-c0532c5c2943\") " pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:23.932578 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:23.932115 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:23.932578 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:23.932136 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:23.932578 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:23.932149 2571 projected.go:194] Error preparing data for projected volume kube-api-access-58b4g for pod openshift-network-diagnostics/network-check-target-kx5kn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:23.932578 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:23.932203 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g podName:a96394bb-18f6-42cc-975e-c0532c5c2943 nodeName:}" failed. 
No retries permitted until 2026-04-17 18:49:25.932184672 +0000 UTC m=+5.206201952 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-58b4g" (UniqueName: "kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g") pod "network-check-target-kx5kn" (UID: "a96394bb-18f6-42cc-975e-c0532c5c2943") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:24.145204 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:24.145162 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 18:44:22 +0000 UTC" deadline="2027-10-12 11:46:20.813495022 +0000 UTC" Apr 17 18:49:24.145204 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:24.145198 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13024h56m56.66830065s" Apr 17 18:49:24.234536 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:24.234506 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:24.234702 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:24.234612 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943" Apr 17 18:49:24.235003 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:24.234983 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:24.235126 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:24.235086 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b" Apr 17 18:49:25.747485 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:25.746765 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:25.751479 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:25.748162 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:25.751479 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:25.748250 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs podName:9cb68ed8-ce9b-48b8-9980-07d87baf968b nodeName:}" failed. No retries permitted until 2026-04-17 18:49:29.748229516 +0000 UTC m=+9.022246807 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs") pod "network-metrics-daemon-d99zz" (UID: "9cb68ed8-ce9b-48b8-9980-07d87baf968b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:25.948312 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:25.948271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58b4g\" (UniqueName: \"kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g\") pod \"network-check-target-kx5kn\" (UID: \"a96394bb-18f6-42cc-975e-c0532c5c2943\") " pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:25.948526 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:25.948486 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:25.948526 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:25.948504 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:25.948526 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:25.948517 2571 projected.go:194] Error preparing data for projected volume kube-api-access-58b4g for pod openshift-network-diagnostics/network-check-target-kx5kn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:25.948706 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:25.948582 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g podName:a96394bb-18f6-42cc-975e-c0532c5c2943 nodeName:}" failed. 
No retries permitted until 2026-04-17 18:49:29.948560605 +0000 UTC m=+9.222577879 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-58b4g" (UniqueName: "kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g") pod "network-check-target-kx5kn" (UID: "a96394bb-18f6-42cc-975e-c0532c5c2943") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:26.233217 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:26.233081 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:26.233217 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:26.233123 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:26.233494 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:26.233232 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b" Apr 17 18:49:26.233640 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:26.233587 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943" Apr 17 18:49:28.233205 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:28.233170 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:28.233702 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:28.233301 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943" Apr 17 18:49:28.233702 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:28.233424 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:28.233702 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:28.233547 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b" Apr 17 18:49:29.781964 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:29.781927 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:29.782379 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:29.782084 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:29.782379 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:29.782153 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs podName:9cb68ed8-ce9b-48b8-9980-07d87baf968b nodeName:}" failed. No retries permitted until 2026-04-17 18:49:37.782132897 +0000 UTC m=+17.056150185 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs") pod "network-metrics-daemon-d99zz" (UID: "9cb68ed8-ce9b-48b8-9980-07d87baf968b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:29.984515 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:29.984481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58b4g\" (UniqueName: \"kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g\") pod \"network-check-target-kx5kn\" (UID: \"a96394bb-18f6-42cc-975e-c0532c5c2943\") " pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:29.984695 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:29.984627 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:29.984695 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:29.984647 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:29.984695 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:29.984661 2571 projected.go:194] Error preparing data for projected volume kube-api-access-58b4g for pod openshift-network-diagnostics/network-check-target-kx5kn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:29.984859 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:29.984721 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g podName:a96394bb-18f6-42cc-975e-c0532c5c2943 nodeName:}" failed. 
No retries permitted until 2026-04-17 18:49:37.984701814 +0000 UTC m=+17.258719104 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-58b4g" (UniqueName: "kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g") pod "network-check-target-kx5kn" (UID: "a96394bb-18f6-42cc-975e-c0532c5c2943") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:30.233448 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:30.233063 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:30.233448 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:30.233090 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:30.233448 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:30.233242 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943" Apr 17 18:49:30.233731 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:30.233673 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b" Apr 17 18:49:32.232779 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:32.232747 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:32.233190 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:32.232747 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:32.233190 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:32.232865 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943" Apr 17 18:49:32.233190 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:32.232935 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b" Apr 17 18:49:34.233149 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:34.233099 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:34.233739 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:34.233104 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:34.233739 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:34.233356 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b" Apr 17 18:49:34.233739 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:34.233232 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943" Apr 17 18:49:36.233500 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:36.233468 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:36.233940 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:36.233469 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:36.233940 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:36.233605 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b" Apr 17 18:49:36.233940 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:36.233682 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943" Apr 17 18:49:37.844610 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:37.844571 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:37.845100 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:37.844730 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:37.845100 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:37.844833 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs podName:9cb68ed8-ce9b-48b8-9980-07d87baf968b nodeName:}" failed. No retries permitted until 2026-04-17 18:49:53.844812039 +0000 UTC m=+33.118829314 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs") pod "network-metrics-daemon-d99zz" (UID: "9cb68ed8-ce9b-48b8-9980-07d87baf968b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:38.045507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:38.045474 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58b4g\" (UniqueName: \"kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g\") pod \"network-check-target-kx5kn\" (UID: \"a96394bb-18f6-42cc-975e-c0532c5c2943\") " pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:38.045777 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:38.045622 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:38.045777 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:38.045643 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:38.045777 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:38.045657 2571 projected.go:194] Error preparing data for projected volume kube-api-access-58b4g for pod openshift-network-diagnostics/network-check-target-kx5kn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:38.045777 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:38.045724 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g podName:a96394bb-18f6-42cc-975e-c0532c5c2943 nodeName:}" failed. 
No retries permitted until 2026-04-17 18:49:54.045704599 +0000 UTC m=+33.319721885 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-58b4g" (UniqueName: "kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g") pod "network-check-target-kx5kn" (UID: "a96394bb-18f6-42cc-975e-c0532c5c2943") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:38.233610 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:38.233512 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:38.233610 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:38.233606 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:38.233843 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:38.233730 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b" Apr 17 18:49:38.233906 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:38.233866 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943" Apr 17 18:49:40.233675 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.233213 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:40.234260 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.233222 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:49:40.234260 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:40.233736 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943" Apr 17 18:49:40.234260 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:40.233823 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b" Apr 17 18:49:40.349774 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.349655 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" event={"ID":"ccccfa46-ab60-4610-8d60-6fad3773eedb","Type":"ContainerStarted","Data":"15231d6b42b7d4986e53d522b2b1fd5b140450af967daa53d7a95bb1f6b70ded"} Apr 17 18:49:40.349774 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.349696 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" event={"ID":"ccccfa46-ab60-4610-8d60-6fad3773eedb","Type":"ContainerStarted","Data":"0e68f1f4985343687fa16e15213538f61f7eb391597a84ee3d7642d307b6ea18"} Apr 17 18:49:40.349774 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.349710 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" event={"ID":"ccccfa46-ab60-4610-8d60-6fad3773eedb","Type":"ContainerStarted","Data":"50a1f6989e049a9a30316fee6e12dd2957fb33acd11348603124cd980965fc5a"} Apr 17 18:49:40.349774 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.349722 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" event={"ID":"ccccfa46-ab60-4610-8d60-6fad3773eedb","Type":"ContainerStarted","Data":"65ae4ddd47ad38e4743503b389ede003a60cd0ae13d181cc9c1165b2071ee8f2"} Apr 17 18:49:40.349774 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.349734 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" event={"ID":"ccccfa46-ab60-4610-8d60-6fad3773eedb","Type":"ContainerStarted","Data":"0fe93c21a34406b75997c0b960878122551bd64501f7dc239064e0caa1d518b7"} Apr 17 18:49:40.349774 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.349747 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" 
event={"ID":"ccccfa46-ab60-4610-8d60-6fad3773eedb","Type":"ContainerStarted","Data":"06ec734168075e245e44d4b23ad542dc6082086426e48f7658f36d72b125a13a"} Apr 17 18:49:40.352287 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.352265 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcrqw" event={"ID":"d04b9d8e-1775-4cd8-ac25-94161e15b4ee","Type":"ContainerStarted","Data":"77f8682eef97736928de1fc1819b358affb88470f1abfa89fd81068c9a3938dc"} Apr 17 18:49:40.354209 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.354186 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" event={"ID":"976291a6-4419-41fa-a907-55cd1103cf75","Type":"ContainerStarted","Data":"c7a1b772974bdb6069ec3ced2c31022e9d52713a9eebee8a91473b0eabed6109"} Apr 17 18:49:40.356068 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.355704 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-192.ec2.internal" event={"ID":"0286f842e2a4c3c84425f64aac72ff7f","Type":"ContainerStarted","Data":"b5377673355cae71576632d9f2f1fde99a114019ae029ce14122a60be2f225d5"} Apr 17 18:49:40.368108 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.368057 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mcrqw" podStartSLOduration=2.075366158 podStartE2EDuration="19.368045633s" podCreationTimestamp="2026-04-17 18:49:21 +0000 UTC" firstStartedPulling="2026-04-17 18:49:22.505545846 +0000 UTC m=+1.779563119" lastFinishedPulling="2026-04-17 18:49:39.798225323 +0000 UTC m=+19.072242594" observedRunningTime="2026-04-17 18:49:40.367715084 +0000 UTC m=+19.641732376" watchObservedRunningTime="2026-04-17 18:49:40.368045633 +0000 UTC m=+19.642062927" Apr 17 18:49:40.384609 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.384571 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-132-192.ec2.internal" podStartSLOduration=18.384556559 podStartE2EDuration="18.384556559s" podCreationTimestamp="2026-04-17 18:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:49:40.384138367 +0000 UTC m=+19.658155656" watchObservedRunningTime="2026-04-17 18:49:40.384556559 +0000 UTC m=+19.658573853"
Apr 17 18:49:40.399651 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:40.399616 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qzf4h" podStartSLOduration=2.126778844 podStartE2EDuration="19.399605919s" podCreationTimestamp="2026-04-17 18:49:21 +0000 UTC" firstStartedPulling="2026-04-17 18:49:22.489531024 +0000 UTC m=+1.763548299" lastFinishedPulling="2026-04-17 18:49:39.762358098 +0000 UTC m=+19.036375374" observedRunningTime="2026-04-17 18:49:40.399321046 +0000 UTC m=+19.673338340" watchObservedRunningTime="2026-04-17 18:49:40.399605919 +0000 UTC m=+19.673623230"
Apr 17 18:49:41.306258 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.306237 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 18:49:41.359105 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.358897 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-plbd4" event={"ID":"8210137d-ed94-434c-897d-f67481261a39","Type":"ContainerStarted","Data":"a3dedabd18865ce5be9e02130f9662215bf0af20c6a675084f70bb5c8c87b84c"}
Apr 17 18:49:41.360377 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.360354 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" event={"ID":"696b7876-308e-4809-8fe2-ebe856ea41b9","Type":"ContainerStarted","Data":"10de38d5a623cf77664e130eb483c4e686fdea0ce722d32e362f3009d2ec36b6"}
Apr 17 18:49:41.360377 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.360381 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" event={"ID":"696b7876-308e-4809-8fe2-ebe856ea41b9","Type":"ContainerStarted","Data":"adf1a2782e4f0f1ffd613a2fa9bf639b6fe3dec467ab372b74794eaabfdc0a52"}
Apr 17 18:49:41.361507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.361487 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t5hn4" event={"ID":"3d42e5aa-a588-4ce1-a264-4581a72945cb","Type":"ContainerStarted","Data":"bdb2885821cbdf6691eb93b7d9bf25af5fbb50eb054dafc9ec87c1f101b3340e"}
Apr 17 18:49:41.362755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.362731 2571 generic.go:358] "Generic (PLEG): container finished" podID="970c6428-01b9-4aa1-be61-fc714b218008" containerID="d762ac50fc06c37205facc0f50a3806d5a2a31362d5aac0c62b00c6b5a96c79a" exitCode=0
Apr 17 18:49:41.362837 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.362808 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-svmpv" event={"ID":"970c6428-01b9-4aa1-be61-fc714b218008","Type":"ContainerDied","Data":"d762ac50fc06c37205facc0f50a3806d5a2a31362d5aac0c62b00c6b5a96c79a"}
Apr 17 18:49:41.366335 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.364307 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rjfcm" event={"ID":"396ffb22-0675-49a2-99f6-018d18d53fe3","Type":"ContainerStarted","Data":"941c25e7d31d313545061a2957a1cbac0afab8add5a9b01dc2e8cdb9e9c5cda0"}
Apr 17 18:49:41.367774 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.367751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hscd2" event={"ID":"91b8b586-8905-45d3-8ed4-577118981cd3","Type":"ContainerStarted","Data":"305c70d1c29ea80d6aa6c3712fef7a8927a3542f31b5bcd79d49df1db69b8b95"}
Apr 17 18:49:41.368996 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.368973 2571 generic.go:358] "Generic (PLEG): container finished" podID="b98093225b62e87c02ad86cc72d0d4aa" containerID="5963e3bf74d06ee1c8f9f0e1c214269ba2020bee4c8ea476f5d78a2aacaba8f7" exitCode=0
Apr 17 18:49:41.369100 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.369080 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" event={"ID":"b98093225b62e87c02ad86cc72d0d4aa","Type":"ContainerDied","Data":"5963e3bf74d06ee1c8f9f0e1c214269ba2020bee4c8ea476f5d78a2aacaba8f7"}
Apr 17 18:49:41.374451 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.374416 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-plbd4" podStartSLOduration=3.108230167 podStartE2EDuration="20.374406772s" podCreationTimestamp="2026-04-17 18:49:21 +0000 UTC" firstStartedPulling="2026-04-17 18:49:22.496607883 +0000 UTC m=+1.770625155" lastFinishedPulling="2026-04-17 18:49:39.762784476 +0000 UTC m=+19.036801760" observedRunningTime="2026-04-17 18:49:41.374184029 +0000 UTC m=+20.648201323" watchObservedRunningTime="2026-04-17 18:49:41.374406772 +0000 UTC m=+20.648424065"
Apr 17 18:49:41.394471 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.394395 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-t5hn4" podStartSLOduration=3.148968798 podStartE2EDuration="20.394384679s" podCreationTimestamp="2026-04-17 18:49:21 +0000 UTC" firstStartedPulling="2026-04-17 18:49:22.515297527 +0000 UTC m=+1.789314798" lastFinishedPulling="2026-04-17 18:49:39.760713397 +0000 UTC m=+19.034730679" observedRunningTime="2026-04-17 18:49:41.394221933 +0000 UTC m=+20.668239226" watchObservedRunningTime="2026-04-17 18:49:41.394384679 +0000 UTC m=+20.668401987"
Apr 17 18:49:41.410110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.410069 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rjfcm" podStartSLOduration=3.1063929200000002 podStartE2EDuration="20.410056296s" podCreationTimestamp="2026-04-17 18:49:21 +0000 UTC" firstStartedPulling="2026-04-17 18:49:22.457013511 +0000 UTC m=+1.731030783" lastFinishedPulling="2026-04-17 18:49:39.760676887 +0000 UTC m=+19.034694159" observedRunningTime="2026-04-17 18:49:41.410007 +0000 UTC m=+20.684024295" watchObservedRunningTime="2026-04-17 18:49:41.410056296 +0000 UTC m=+20.684073592"
Apr 17 18:49:41.472337 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:41.472288 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hscd2" podStartSLOduration=3.149916511 podStartE2EDuration="20.472271964s" podCreationTimestamp="2026-04-17 18:49:21 +0000 UTC" firstStartedPulling="2026-04-17 18:49:22.438152152 +0000 UTC m=+1.712169423" lastFinishedPulling="2026-04-17 18:49:39.760507592 +0000 UTC m=+19.034524876" observedRunningTime="2026-04-17 18:49:41.471923378 +0000 UTC m=+20.745940670" watchObservedRunningTime="2026-04-17 18:49:41.472271964 +0000 UTC m=+20.746289257"
Apr 17 18:49:42.184509 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:42.184378 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T18:49:41.306255547Z","UUID":"acdf665a-f1c0-4651-83e9-ac9140ad3fc2","Handler":null,"Name":"","Endpoint":""}
Apr 17 18:49:42.186360 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:42.186341 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 18:49:42.186442 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:42.186367 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 18:49:42.232922 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:42.232897 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn"
Apr 17 18:49:42.232922 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:42.232912 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz"
Apr 17 18:49:42.233095 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:42.233012 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943"
Apr 17 18:49:42.233174 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:42.233142 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b"
Apr 17 18:49:42.372981 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:42.372918 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" event={"ID":"696b7876-308e-4809-8fe2-ebe856ea41b9","Type":"ContainerStarted","Data":"e1418b5abf31e616196a283f496a21164d91fd4cc4598ad9fb6efe95f4b0436d"}
Apr 17 18:49:42.375034 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:42.374997 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" event={"ID":"b98093225b62e87c02ad86cc72d0d4aa","Type":"ContainerStarted","Data":"f62b1e6dbbfb2af6f9b20368d8516f41a5ea258387cff97efe715152600a8985"}
Apr 17 18:49:42.391673 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:42.391617 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmqn4" podStartSLOduration=1.830254358 podStartE2EDuration="21.391600872s" podCreationTimestamp="2026-04-17 18:49:21 +0000 UTC" firstStartedPulling="2026-04-17 18:49:22.483606217 +0000 UTC m=+1.757623488" lastFinishedPulling="2026-04-17 18:49:42.044952731 +0000 UTC m=+21.318970002" observedRunningTime="2026-04-17 18:49:42.391540489 +0000 UTC m=+21.665557779" watchObservedRunningTime="2026-04-17 18:49:42.391600872 +0000 UTC m=+21.665618166"
Apr 17 18:49:42.406082 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:42.406039 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-192.ec2.internal" podStartSLOduration=20.406024188 podStartE2EDuration="20.406024188s" podCreationTimestamp="2026-04-17 18:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:49:42.405908413 +0000 UTC m=+21.679925716" watchObservedRunningTime="2026-04-17 18:49:42.406024188 +0000 UTC m=+21.680041479"
Apr 17 18:49:43.379881 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:43.379840 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" event={"ID":"ccccfa46-ab60-4610-8d60-6fad3773eedb","Type":"ContainerStarted","Data":"205d902f1070e9e8d25c0ecfee3896c63f21a27aa4f3b8d50899b0eee17992bb"}
Apr 17 18:49:43.990438 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:43.990400 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rjfcm"
Apr 17 18:49:44.232507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:44.232471 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rjfcm"
Apr 17 18:49:44.232697 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:44.232575 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz"
Apr 17 18:49:44.232697 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:44.232595 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn"
Apr 17 18:49:44.232857 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:44.232695 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b"
Apr 17 18:49:44.232857 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:44.232825 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943"
Apr 17 18:49:44.233537 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:44.233513 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rjfcm"
Apr 17 18:49:44.382405 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:44.382373 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rjfcm"
Apr 17 18:49:45.388404 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:45.387998 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" event={"ID":"ccccfa46-ab60-4610-8d60-6fad3773eedb","Type":"ContainerStarted","Data":"df9644b71cea1fbd7641fecd441045e67035a7d2f117c5ebe12f51b603d8d493"}
Apr 17 18:49:45.423947 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:45.423902 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd" podStartSLOduration=7.033539449 podStartE2EDuration="24.423888669s" podCreationTimestamp="2026-04-17 18:49:21 +0000 UTC" firstStartedPulling="2026-04-17 18:49:22.512611201 +0000 UTC m=+1.786628479" lastFinishedPulling="2026-04-17 18:49:39.902960428 +0000 UTC m=+19.176977699" observedRunningTime="2026-04-17 18:49:45.42367186 +0000 UTC m=+24.697689152" watchObservedRunningTime="2026-04-17 18:49:45.423888669 +0000 UTC m=+24.697905962"
Apr 17 18:49:46.232938 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:46.232908 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz"
Apr 17 18:49:46.233124 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:46.232917 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn"
Apr 17 18:49:46.233124 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:46.233007 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b"
Apr 17 18:49:46.233124 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:46.233082 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943"
Apr 17 18:49:46.391679 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:46.391647 2571 generic.go:358] "Generic (PLEG): container finished" podID="970c6428-01b9-4aa1-be61-fc714b218008" containerID="e4a21838bd17afbcfb16bde354d578b5abb9e4fe0c38518d9e714a0244607614" exitCode=0
Apr 17 18:49:46.392152 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:46.391712 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-svmpv" event={"ID":"970c6428-01b9-4aa1-be61-fc714b218008","Type":"ContainerDied","Data":"e4a21838bd17afbcfb16bde354d578b5abb9e4fe0c38518d9e714a0244607614"}
Apr 17 18:49:46.392599 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:46.392523 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:46.392599 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:46.392566 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:46.392599 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:46.392581 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:46.407315 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:46.407296 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:46.408007 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:46.407993 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:49:47.261993 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:47.261781 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kx5kn"]
Apr 17 18:49:47.262119 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:47.262094 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn"
Apr 17 18:49:47.262202 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:47.262180 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943"
Apr 17 18:49:47.264862 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:47.264839 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d99zz"]
Apr 17 18:49:47.264946 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:47.264923 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz"
Apr 17 18:49:47.265017 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:47.265000 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b"
Apr 17 18:49:48.396512 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:48.396471 2571 generic.go:358] "Generic (PLEG): container finished" podID="970c6428-01b9-4aa1-be61-fc714b218008" containerID="e564daea6c37527d52ad8071792d8f606b6c666fcbf53cb5d3d236edda0037e9" exitCode=0
Apr 17 18:49:48.396936 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:48.396551 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-svmpv" event={"ID":"970c6428-01b9-4aa1-be61-fc714b218008","Type":"ContainerDied","Data":"e564daea6c37527d52ad8071792d8f606b6c666fcbf53cb5d3d236edda0037e9"}
Apr 17 18:49:49.232800 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:49.232770 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz"
Apr 17 18:49:49.232940 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:49.232881 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b"
Apr 17 18:49:49.233003 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:49.232934 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn"
Apr 17 18:49:49.233057 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:49.233026 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943"
Apr 17 18:49:49.400761 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:49.400681 2571 generic.go:358] "Generic (PLEG): container finished" podID="970c6428-01b9-4aa1-be61-fc714b218008" containerID="38f23793effedcba45d09ec5fa43124bb29ef91b43e0ef9560d7d013a6de3b13" exitCode=0
Apr 17 18:49:49.400761 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:49.400742 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-svmpv" event={"ID":"970c6428-01b9-4aa1-be61-fc714b218008","Type":"ContainerDied","Data":"38f23793effedcba45d09ec5fa43124bb29ef91b43e0ef9560d7d013a6de3b13"}
Apr 17 18:49:51.233368 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:51.233338 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn"
Apr 17 18:49:51.233884 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:51.233443 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kx5kn" podUID="a96394bb-18f6-42cc-975e-c0532c5c2943"
Apr 17 18:49:51.233884 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:51.233506 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz"
Apr 17 18:49:51.233884 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:51.233587 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d99zz" podUID="9cb68ed8-ce9b-48b8-9980-07d87baf968b"
Apr 17 18:49:52.608709 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.608674 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-192.ec2.internal" event="NodeReady"
Apr 17 18:49:52.609137 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.608809 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 18:49:52.666270 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.666241 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gr8p7"]
Apr 17 18:49:52.689065 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.689037 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ntz86"]
Apr 17 18:49:52.689226 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.689207 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:52.692152 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.692121 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ctvd8\""
Apr 17 18:49:52.692152 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.692121 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 18:49:52.692344 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.692166 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 18:49:52.703700 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.703681 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gr8p7"]
Apr 17 18:49:52.703700 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.703702 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ntz86"]
Apr 17 18:49:52.703819 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.703774 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ntz86"
Apr 17 18:49:52.707396 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.707379 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 18:49:52.707538 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.707524 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 18:49:52.707538 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.707529 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 18:49:52.707666 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.707556 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ppj8p\""
Apr 17 18:49:52.753705 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.753681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/029e659d-f8ff-4796-ba06-aba3f3e2a830-tmp-dir\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:52.753820 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.753723 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hsn9\" (UniqueName: \"kubernetes.io/projected/029e659d-f8ff-4796-ba06-aba3f3e2a830-kube-api-access-2hsn9\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:52.753820 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.753765 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/029e659d-f8ff-4796-ba06-aba3f3e2a830-config-volume\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:52.753820 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.753817 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:52.854474 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.854429 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/029e659d-f8ff-4796-ba06-aba3f3e2a830-tmp-dir\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:52.854623 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.854485 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hsn9\" (UniqueName: \"kubernetes.io/projected/029e659d-f8ff-4796-ba06-aba3f3e2a830-kube-api-access-2hsn9\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:52.854623 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.854531 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/029e659d-f8ff-4796-ba06-aba3f3e2a830-config-volume\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:52.854623 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.854590 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:52.854623 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.854613 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86"
Apr 17 18:49:52.854807 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.854653 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn2kx\" (UniqueName: \"kubernetes.io/projected/ff724f21-8096-4540-9e0b-484999e3ecd1-kube-api-access-mn2kx\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86"
Apr 17 18:49:52.854859 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.854814 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/029e659d-f8ff-4796-ba06-aba3f3e2a830-tmp-dir\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:52.854933 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:52.854917 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 18:49:52.854986 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:52.854978 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls podName:029e659d-f8ff-4796-ba06-aba3f3e2a830 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:53.354956648 +0000 UTC m=+32.628973928 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls") pod "dns-default-gr8p7" (UID: "029e659d-f8ff-4796-ba06-aba3f3e2a830") : secret "dns-default-metrics-tls" not found
Apr 17 18:49:52.855218 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.855195 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/029e659d-f8ff-4796-ba06-aba3f3e2a830-config-volume\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:52.865356 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.865116 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hsn9\" (UniqueName: \"kubernetes.io/projected/029e659d-f8ff-4796-ba06-aba3f3e2a830-kube-api-access-2hsn9\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:52.955530 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.955494 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86"
Apr 17 18:49:52.955674 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.955545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn2kx\" (UniqueName: \"kubernetes.io/projected/ff724f21-8096-4540-9e0b-484999e3ecd1-kube-api-access-mn2kx\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86"
Apr 17 18:49:52.955674 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:52.955662 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 18:49:52.955799 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:52.955736 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert podName:ff724f21-8096-4540-9e0b-484999e3ecd1 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:53.455716996 +0000 UTC m=+32.729734281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert") pod "ingress-canary-ntz86" (UID: "ff724f21-8096-4540-9e0b-484999e3ecd1") : secret "canary-serving-cert" not found
Apr 17 18:49:52.966736 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:52.966702 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn2kx\" (UniqueName: \"kubernetes.io/projected/ff724f21-8096-4540-9e0b-484999e3ecd1-kube-api-access-mn2kx\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86"
Apr 17 18:49:53.233349 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:53.233270 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz"
Apr 17 18:49:53.233349 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:53.233318 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn"
Apr 17 18:49:53.236097 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:53.235898 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 18:49:53.236097 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:53.235923 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 18:49:53.236097 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:53.235909 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dwb49\""
Apr 17 18:49:53.236097 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:53.235907 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 18:49:53.236097 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:53.235969 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tlcmp\""
Apr 17 18:49:53.357955 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:53.357920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:49:53.358125 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:53.358052 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 18:49:53.358185 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:53.358128 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls podName:029e659d-f8ff-4796-ba06-aba3f3e2a830 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:54.358106023 +0000 UTC m=+33.632123294 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls") pod "dns-default-gr8p7" (UID: "029e659d-f8ff-4796-ba06-aba3f3e2a830") : secret "dns-default-metrics-tls" not found
Apr 17 18:49:53.458703 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:53.458661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86"
Apr 17 18:49:53.458889 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:53.458815 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 18:49:53.458952 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:53.458908 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert podName:ff724f21-8096-4540-9e0b-484999e3ecd1 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:54.458886448 +0000 UTC m=+33.732903743 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert") pod "ingress-canary-ntz86" (UID: "ff724f21-8096-4540-9e0b-484999e3ecd1") : secret "canary-serving-cert" not found
Apr 17 18:49:53.861665 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:53.861622 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz"
Apr 17 18:49:53.862065 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:53.861749 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 18:49:53.862065 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:53.861833 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs podName:9cb68ed8-ce9b-48b8-9980-07d87baf968b nodeName:}" failed. No retries permitted until 2026-04-17 18:50:25.86181173 +0000 UTC m=+65.135829005 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs") pod "network-metrics-daemon-d99zz" (UID: "9cb68ed8-ce9b-48b8-9980-07d87baf968b") : secret "metrics-daemon-secret" not found Apr 17 18:49:54.063553 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:54.063514 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58b4g\" (UniqueName: \"kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g\") pod \"network-check-target-kx5kn\" (UID: \"a96394bb-18f6-42cc-975e-c0532c5c2943\") " pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:54.066735 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:54.066696 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58b4g\" (UniqueName: \"kubernetes.io/projected/a96394bb-18f6-42cc-975e-c0532c5c2943-kube-api-access-58b4g\") pod \"network-check-target-kx5kn\" (UID: \"a96394bb-18f6-42cc-975e-c0532c5c2943\") " pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:54.149779 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:54.149706 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:54.365838 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:54.365794 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7" Apr 17 18:49:54.366004 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:54.365944 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:49:54.366074 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:54.366018 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls podName:029e659d-f8ff-4796-ba06-aba3f3e2a830 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:56.366003282 +0000 UTC m=+35.640020567 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls") pod "dns-default-gr8p7" (UID: "029e659d-f8ff-4796-ba06-aba3f3e2a830") : secret "dns-default-metrics-tls" not found Apr 17 18:49:54.466218 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:54.466143 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86" Apr 17 18:49:54.466361 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:54.466301 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:54.466414 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:54.466366 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert podName:ff724f21-8096-4540-9e0b-484999e3ecd1 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:56.466347956 +0000 UTC m=+35.740365227 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert") pod "ingress-canary-ntz86" (UID: "ff724f21-8096-4540-9e0b-484999e3ecd1") : secret "canary-serving-cert" not found Apr 17 18:49:55.016847 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:55.016807 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kx5kn"] Apr 17 18:49:55.104993 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:49:55.104954 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda96394bb_18f6_42cc_975e_c0532c5c2943.slice/crio-30f0d559e43c1c708cefe832c495b53a8a3eb0e1e838bffe80206821cc71457b WatchSource:0}: Error finding container 30f0d559e43c1c708cefe832c495b53a8a3eb0e1e838bffe80206821cc71457b: Status 404 returned error can't find the container with id 30f0d559e43c1c708cefe832c495b53a8a3eb0e1e838bffe80206821cc71457b Apr 17 18:49:55.413194 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:55.413167 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kx5kn" event={"ID":"a96394bb-18f6-42cc-975e-c0532c5c2943","Type":"ContainerStarted","Data":"30f0d559e43c1c708cefe832c495b53a8a3eb0e1e838bffe80206821cc71457b"} Apr 17 18:49:55.415345 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:55.415318 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-svmpv" event={"ID":"970c6428-01b9-4aa1-be61-fc714b218008","Type":"ContainerStarted","Data":"224e882c878bbe07b92765706dcec8733a9649e1dbb8d4f5e85a8fa033f6de83"} Apr 17 18:49:56.380077 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:56.379893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls\") pod \"dns-default-gr8p7\" 
(UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7" Apr 17 18:49:56.380531 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:56.380031 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:49:56.380531 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:56.380194 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls podName:029e659d-f8ff-4796-ba06-aba3f3e2a830 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:00.380172337 +0000 UTC m=+39.654189608 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls") pod "dns-default-gr8p7" (UID: "029e659d-f8ff-4796-ba06-aba3f3e2a830") : secret "dns-default-metrics-tls" not found Apr 17 18:49:56.420093 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:56.420050 2571 generic.go:358] "Generic (PLEG): container finished" podID="970c6428-01b9-4aa1-be61-fc714b218008" containerID="224e882c878bbe07b92765706dcec8733a9649e1dbb8d4f5e85a8fa033f6de83" exitCode=0 Apr 17 18:49:56.420233 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:56.420115 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-svmpv" event={"ID":"970c6428-01b9-4aa1-be61-fc714b218008","Type":"ContainerDied","Data":"224e882c878bbe07b92765706dcec8733a9649e1dbb8d4f5e85a8fa033f6de83"} Apr 17 18:49:56.480767 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:56.480730 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86" Apr 17 18:49:56.481316 ip-10-0-132-192 
kubenswrapper[2571]: E0417 18:49:56.480917 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:56.481316 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:49:56.480988 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert podName:ff724f21-8096-4540-9e0b-484999e3ecd1 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:00.480967989 +0000 UTC m=+39.754985277 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert") pod "ingress-canary-ntz86" (UID: "ff724f21-8096-4540-9e0b-484999e3ecd1") : secret "canary-serving-cert" not found Apr 17 18:49:57.426331 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:57.425721 2571 generic.go:358] "Generic (PLEG): container finished" podID="970c6428-01b9-4aa1-be61-fc714b218008" containerID="d788bf76c394ee2d420b07c1d67b4c9c87a7e9eded00a9952e680f20f28f80f9" exitCode=0 Apr 17 18:49:57.426331 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:57.425774 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-svmpv" event={"ID":"970c6428-01b9-4aa1-be61-fc714b218008","Type":"ContainerDied","Data":"d788bf76c394ee2d420b07c1d67b4c9c87a7e9eded00a9952e680f20f28f80f9"} Apr 17 18:49:58.429944 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:58.429918 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-svmpv" event={"ID":"970c6428-01b9-4aa1-be61-fc714b218008","Type":"ContainerStarted","Data":"49f5fbdefb3a9e53b93e98f02fea9fcafb234cf5c76e302b9edcfcbcad45511d"} Apr 17 18:49:58.451099 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:58.451062 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-svmpv" 
podStartSLOduration=4.816208749 podStartE2EDuration="37.451049052s" podCreationTimestamp="2026-04-17 18:49:21 +0000 UTC" firstStartedPulling="2026-04-17 18:49:22.502093407 +0000 UTC m=+1.776110686" lastFinishedPulling="2026-04-17 18:49:55.136933718 +0000 UTC m=+34.410950989" observedRunningTime="2026-04-17 18:49:58.44964565 +0000 UTC m=+37.723662942" watchObservedRunningTime="2026-04-17 18:49:58.451049052 +0000 UTC m=+37.725066346" Apr 17 18:49:59.433007 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:59.432968 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kx5kn" event={"ID":"a96394bb-18f6-42cc-975e-c0532c5c2943","Type":"ContainerStarted","Data":"8acd5357efd17cb7d32eff713fd4d3d160b6408d15742dac289368a36ba4fcd2"} Apr 17 18:49:59.433440 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:59.433179 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kx5kn" Apr 17 18:49:59.447088 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:49:59.447049 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kx5kn" podStartSLOduration=34.928617123 podStartE2EDuration="38.447038813s" podCreationTimestamp="2026-04-17 18:49:21 +0000 UTC" firstStartedPulling="2026-04-17 18:49:55.113737813 +0000 UTC m=+34.387755084" lastFinishedPulling="2026-04-17 18:49:58.632159501 +0000 UTC m=+37.906176774" observedRunningTime="2026-04-17 18:49:59.446597465 +0000 UTC m=+38.720614753" watchObservedRunningTime="2026-04-17 18:49:59.447038813 +0000 UTC m=+38.721056105" Apr 17 18:50:00.412136 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:00.412098 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls\") pod \"dns-default-gr8p7\" (UID: 
\"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7" Apr 17 18:50:00.412305 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:00.412231 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:50:00.412305 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:00.412295 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls podName:029e659d-f8ff-4796-ba06-aba3f3e2a830 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:08.412278478 +0000 UTC m=+47.686295753 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls") pod "dns-default-gr8p7" (UID: "029e659d-f8ff-4796-ba06-aba3f3e2a830") : secret "dns-default-metrics-tls" not found Apr 17 18:50:00.513077 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:00.513039 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86" Apr 17 18:50:00.513481 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:00.513186 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:50:00.513481 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:00.513251 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert podName:ff724f21-8096-4540-9e0b-484999e3ecd1 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:08.51323646 +0000 UTC m=+47.787253730 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert") pod "ingress-canary-ntz86" (UID: "ff724f21-8096-4540-9e0b-484999e3ecd1") : secret "canary-serving-cert" not found Apr 17 18:50:06.068929 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.068894 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz"] Apr 17 18:50:06.113470 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.113420 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz"] Apr 17 18:50:06.113470 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.113468 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb"] Apr 17 18:50:06.113695 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.113530 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz" Apr 17 18:50:06.116010 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.115987 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 18:50:06.116662 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.116645 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 18:50:06.116771 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.116679 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-b7c5q\"" Apr 17 18:50:06.116771 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.116743 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 18:50:06.116908 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.116791 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 18:50:06.127838 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.127820 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb"] Apr 17 18:50:06.127930 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.127912 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" Apr 17 18:50:06.130018 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.130000 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 18:50:06.248494 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.248445 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128-tmp\") pod \"klusterlet-addon-workmgr-68f4567b6f-c92mb\" (UID: \"b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" Apr 17 18:50:06.248674 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.248570 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128-klusterlet-config\") pod \"klusterlet-addon-workmgr-68f4567b6f-c92mb\" (UID: \"b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" Apr 17 18:50:06.248674 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.248598 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjzkh\" (UniqueName: \"kubernetes.io/projected/f619fb83-49ab-4de8-aa02-9d653c915c2e-kube-api-access-cjzkh\") pod \"managed-serviceaccount-addon-agent-66f9f76c87-v7gbz\" (UID: \"f619fb83-49ab-4de8-aa02-9d653c915c2e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz" Apr 17 18:50:06.248674 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.248621 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" 
(UniqueName: \"kubernetes.io/secret/f619fb83-49ab-4de8-aa02-9d653c915c2e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-66f9f76c87-v7gbz\" (UID: \"f619fb83-49ab-4de8-aa02-9d653c915c2e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz" Apr 17 18:50:06.248674 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.248645 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf4wf\" (UniqueName: \"kubernetes.io/projected/b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128-kube-api-access-tf4wf\") pod \"klusterlet-addon-workmgr-68f4567b6f-c92mb\" (UID: \"b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" Apr 17 18:50:06.349846 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.349756 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128-tmp\") pod \"klusterlet-addon-workmgr-68f4567b6f-c92mb\" (UID: \"b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" Apr 17 18:50:06.349987 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.349865 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128-klusterlet-config\") pod \"klusterlet-addon-workmgr-68f4567b6f-c92mb\" (UID: \"b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" Apr 17 18:50:06.349987 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.349894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjzkh\" (UniqueName: \"kubernetes.io/projected/f619fb83-49ab-4de8-aa02-9d653c915c2e-kube-api-access-cjzkh\") pod 
\"managed-serviceaccount-addon-agent-66f9f76c87-v7gbz\" (UID: \"f619fb83-49ab-4de8-aa02-9d653c915c2e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz" Apr 17 18:50:06.349987 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.349924 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f619fb83-49ab-4de8-aa02-9d653c915c2e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-66f9f76c87-v7gbz\" (UID: \"f619fb83-49ab-4de8-aa02-9d653c915c2e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz" Apr 17 18:50:06.349987 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.349953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tf4wf\" (UniqueName: \"kubernetes.io/projected/b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128-kube-api-access-tf4wf\") pod \"klusterlet-addon-workmgr-68f4567b6f-c92mb\" (UID: \"b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" Apr 17 18:50:06.350246 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.350209 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128-tmp\") pod \"klusterlet-addon-workmgr-68f4567b6f-c92mb\" (UID: \"b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" Apr 17 18:50:06.353618 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.353595 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f619fb83-49ab-4de8-aa02-9d653c915c2e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-66f9f76c87-v7gbz\" (UID: \"f619fb83-49ab-4de8-aa02-9d653c915c2e\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz" Apr 17 18:50:06.353618 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.353604 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128-klusterlet-config\") pod \"klusterlet-addon-workmgr-68f4567b6f-c92mb\" (UID: \"b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" Apr 17 18:50:06.356928 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.356905 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjzkh\" (UniqueName: \"kubernetes.io/projected/f619fb83-49ab-4de8-aa02-9d653c915c2e-kube-api-access-cjzkh\") pod \"managed-serviceaccount-addon-agent-66f9f76c87-v7gbz\" (UID: \"f619fb83-49ab-4de8-aa02-9d653c915c2e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz" Apr 17 18:50:06.357080 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.357053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf4wf\" (UniqueName: \"kubernetes.io/projected/b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128-kube-api-access-tf4wf\") pod \"klusterlet-addon-workmgr-68f4567b6f-c92mb\" (UID: \"b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" Apr 17 18:50:06.431513 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.431475 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz" Apr 17 18:50:06.438213 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.438189 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" Apr 17 18:50:06.570884 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.570852 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb"] Apr 17 18:50:06.585405 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:06.585335 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz"] Apr 17 18:50:06.611181 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:50:06.611156 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf619fb83_49ab_4de8_aa02_9d653c915c2e.slice/crio-7d56f0cfb80ab8c5d4b772ba21b59ff49182e7848c481f5cfc83db0083d34b43 WatchSource:0}: Error finding container 7d56f0cfb80ab8c5d4b772ba21b59ff49182e7848c481f5cfc83db0083d34b43: Status 404 returned error can't find the container with id 7d56f0cfb80ab8c5d4b772ba21b59ff49182e7848c481f5cfc83db0083d34b43 Apr 17 18:50:07.448226 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:07.448183 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" event={"ID":"b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128","Type":"ContainerStarted","Data":"46a19e43d1294b3a53b7fa8e18860014cb28232e56e5ef9ae8f8a94a7c22da13"} Apr 17 18:50:07.449527 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:07.449495 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz" event={"ID":"f619fb83-49ab-4de8-aa02-9d653c915c2e","Type":"ContainerStarted","Data":"7d56f0cfb80ab8c5d4b772ba21b59ff49182e7848c481f5cfc83db0083d34b43"} Apr 17 18:50:08.465801 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:08.465767 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:50:08.466191 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:08.465909 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 18:50:08.466191 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:08.465992 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls podName:029e659d-f8ff-4796-ba06-aba3f3e2a830 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:24.465974131 +0000 UTC m=+63.739991411 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls") pod "dns-default-gr8p7" (UID: "029e659d-f8ff-4796-ba06-aba3f3e2a830") : secret "dns-default-metrics-tls" not found
Apr 17 18:50:08.566578 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:08.566543 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86"
Apr 17 18:50:08.566748 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:08.566679 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 18:50:08.566748 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:08.566749 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert podName:ff724f21-8096-4540-9e0b-484999e3ecd1 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:24.566733274 +0000 UTC m=+63.840750544 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert") pod "ingress-canary-ntz86" (UID: "ff724f21-8096-4540-9e0b-484999e3ecd1") : secret "canary-serving-cert" not found
Apr 17 18:50:10.456980 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:10.456950 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz" event={"ID":"f619fb83-49ab-4de8-aa02-9d653c915c2e","Type":"ContainerStarted","Data":"a13bc092818fe0475dc3c692fa3f8de214c7a037e4de664c002ec76293cb98c5"}
Apr 17 18:50:10.470917 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:10.470872 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66f9f76c87-v7gbz" podStartSLOduration=1.327418857 podStartE2EDuration="4.47085983s" podCreationTimestamp="2026-04-17 18:50:06 +0000 UTC" firstStartedPulling="2026-04-17 18:50:06.613013614 +0000 UTC m=+45.887030886" lastFinishedPulling="2026-04-17 18:50:09.756454584 +0000 UTC m=+49.030471859" observedRunningTime="2026-04-17 18:50:10.470209035 +0000 UTC m=+49.744226327" watchObservedRunningTime="2026-04-17 18:50:10.47085983 +0000 UTC m=+49.744877123"
Apr 17 18:50:16.469036 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:16.469000 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" event={"ID":"b16c6eb0-dcd4-4fa0-aba0-5e3fab5f8128","Type":"ContainerStarted","Data":"c2110912268e7d202484fe6383965739020f79224ab4d4f37ccdba25f2ec7f9e"}
Apr 17 18:50:16.469382 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:16.469217 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb"
Apr 17 18:50:16.470453 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:16.470435 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb"
Apr 17 18:50:16.498724 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:16.498678 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f4567b6f-c92mb" podStartSLOduration=1.180598302 podStartE2EDuration="10.498664515s" podCreationTimestamp="2026-04-17 18:50:06 +0000 UTC" firstStartedPulling="2026-04-17 18:50:06.575378274 +0000 UTC m=+45.849395548" lastFinishedPulling="2026-04-17 18:50:15.893444488 +0000 UTC m=+55.167461761" observedRunningTime="2026-04-17 18:50:16.483546032 +0000 UTC m=+55.757563328" watchObservedRunningTime="2026-04-17 18:50:16.498664515 +0000 UTC m=+55.772681808"
Apr 17 18:50:18.409550 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:18.409519 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sctgd"
Apr 17 18:50:24.476665 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:24.476610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:50:24.477130 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:24.476745 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 18:50:24.477130 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:24.476826 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls podName:029e659d-f8ff-4796-ba06-aba3f3e2a830 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:56.476808834 +0000 UTC m=+95.750826105 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls") pod "dns-default-gr8p7" (UID: "029e659d-f8ff-4796-ba06-aba3f3e2a830") : secret "dns-default-metrics-tls" not found
Apr 17 18:50:24.577607 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:24.577576 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86"
Apr 17 18:50:24.577738 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:24.577666 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 18:50:24.577738 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:24.577719 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert podName:ff724f21-8096-4540-9e0b-484999e3ecd1 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:56.577706065 +0000 UTC m=+95.851723335 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert") pod "ingress-canary-ntz86" (UID: "ff724f21-8096-4540-9e0b-484999e3ecd1") : secret "canary-serving-cert" not found
Apr 17 18:50:25.885020 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:25.884981 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz"
Apr 17 18:50:25.885539 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:25.885140 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 18:50:25.885539 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:25.885223 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs podName:9cb68ed8-ce9b-48b8-9980-07d87baf968b nodeName:}" failed. No retries permitted until 2026-04-17 18:51:29.885200733 +0000 UTC m=+129.159218004 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs") pod "network-metrics-daemon-d99zz" (UID: "9cb68ed8-ce9b-48b8-9980-07d87baf968b") : secret "metrics-daemon-secret" not found
Apr 17 18:50:30.437948 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:30.437919 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kx5kn"
Apr 17 18:50:37.314861 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.314751 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-88g9b"]
Apr 17 18:50:37.316836 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.316819 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7bb99cc44b-rr54f"]
Apr 17 18:50:37.316964 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.316945 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.318472 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.318442 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.319166 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.319142 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 18:50:37.319166 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.319166 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 18:50:37.319306 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.319146 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-86flp\""
Apr 17 18:50:37.319306 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.319258 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 18:50:37.319598 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.319581 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 18:50:37.320350 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.320332 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 18:50:37.320474 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.320387 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 18:50:37.321060 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.321042 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 18:50:37.321155 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.321045 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2m8h2\""
Apr 17 18:50:37.326212 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.326184 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 18:50:37.329228 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.329208 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 18:50:37.329811 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.329788 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-88g9b"]
Apr 17 18:50:37.330813 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.330604 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bb99cc44b-rr54f"]
Apr 17 18:50:37.357308 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357286 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fd97d3-b332-4c85-9344-c0e9f314aed6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.357482 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357314 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b34c122-9e66-4876-ad10-d90898363166-installation-pull-secrets\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.357482 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357336 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fd97d3-b332-4c85-9344-c0e9f314aed6-service-ca-bundle\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.357482 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357351 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b34c122-9e66-4876-ad10-d90898363166-registry-certificates\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.357482 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357427 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5fd97d3-b332-4c85-9344-c0e9f314aed6-tmp\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.357482 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357448 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.357643 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357490 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-bound-sa-token\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.357643 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357531 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c5fd97d3-b332-4c85-9344-c0e9f314aed6-snapshots\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.357643 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357551 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfcgp\" (UniqueName: \"kubernetes.io/projected/c5fd97d3-b332-4c85-9344-c0e9f314aed6-kube-api-access-vfcgp\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.357643 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357569 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b34c122-9e66-4876-ad10-d90898363166-image-registry-private-configuration\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.357643 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357586 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b34c122-9e66-4876-ad10-d90898363166-ca-trust-extracted\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.357643 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw4z7\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-kube-api-access-dw4z7\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.357643 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5fd97d3-b332-4c85-9344-c0e9f314aed6-serving-cert\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.357865 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.357660 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b34c122-9e66-4876-ad10-d90898363166-trusted-ca\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.457946 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.457916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-bound-sa-token\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.458084 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.457952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c5fd97d3-b332-4c85-9344-c0e9f314aed6-snapshots\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.458084 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.457969 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfcgp\" (UniqueName: \"kubernetes.io/projected/c5fd97d3-b332-4c85-9344-c0e9f314aed6-kube-api-access-vfcgp\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.458084 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.457992 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b34c122-9e66-4876-ad10-d90898363166-image-registry-private-configuration\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.458084 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.458019 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b34c122-9e66-4876-ad10-d90898363166-ca-trust-extracted\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.458084 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.458067 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw4z7\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-kube-api-access-dw4z7\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.458335 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.458104 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5fd97d3-b332-4c85-9344-c0e9f314aed6-serving-cert\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.458335 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.458141 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b34c122-9e66-4876-ad10-d90898363166-trusted-ca\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.458335 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.458170 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fd97d3-b332-4c85-9344-c0e9f314aed6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.458335 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.458199 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b34c122-9e66-4876-ad10-d90898363166-installation-pull-secrets\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.458335 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.458230 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fd97d3-b332-4c85-9344-c0e9f314aed6-service-ca-bundle\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.458598 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.458539 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b34c122-9e66-4876-ad10-d90898363166-registry-certificates\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.458653 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.458640 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5fd97d3-b332-4c85-9344-c0e9f314aed6-tmp\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.458714 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.458663 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b34c122-9e66-4876-ad10-d90898363166-ca-trust-extracted\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.458714 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.458670 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.458818 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.458729 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c5fd97d3-b332-4c85-9344-c0e9f314aed6-snapshots\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.458818 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:37.458749 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 18:50:37.458818 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:37.458764 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bb99cc44b-rr54f: secret "image-registry-tls" not found
Apr 17 18:50:37.458967 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:37.458824 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls podName:1b34c122-9e66-4876-ad10-d90898363166 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:37.958806883 +0000 UTC m=+77.232824155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls") pod "image-registry-7bb99cc44b-rr54f" (UID: "1b34c122-9e66-4876-ad10-d90898363166") : secret "image-registry-tls" not found
Apr 17 18:50:37.459029 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.459013 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c5fd97d3-b332-4c85-9344-c0e9f314aed6-tmp\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.459315 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.459289 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b34c122-9e66-4876-ad10-d90898363166-trusted-ca\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.459474 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.459439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fd97d3-b332-4c85-9344-c0e9f314aed6-service-ca-bundle\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.459580 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.459558 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b34c122-9e66-4876-ad10-d90898363166-registry-certificates\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.459733 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.459713 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fd97d3-b332-4c85-9344-c0e9f314aed6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.460600 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.460570 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5fd97d3-b332-4c85-9344-c0e9f314aed6-serving-cert\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.460687 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.460609 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b34c122-9e66-4876-ad10-d90898363166-image-registry-private-configuration\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.461014 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.460997 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b34c122-9e66-4876-ad10-d90898363166-installation-pull-secrets\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.470323 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.470302 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-bound-sa-token\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.470668 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.470653 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw4z7\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-kube-api-access-dw4z7\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.470882 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.470860 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfcgp\" (UniqueName: \"kubernetes.io/projected/c5fd97d3-b332-4c85-9344-c0e9f314aed6-kube-api-access-vfcgp\") pod \"insights-operator-585dfdc468-88g9b\" (UID: \"c5fd97d3-b332-4c85-9344-c0e9f314aed6\") " pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.627361 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.627293 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-88g9b"
Apr 17 18:50:37.738621 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.737161 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-88g9b"]
Apr 17 18:50:37.744214 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:50:37.744189 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5fd97d3_b332_4c85_9344_c0e9f314aed6.slice/crio-53023f0695c971b8f7f083d600cefccd8bb76057f515164499c0f0863c617400 WatchSource:0}: Error finding container 53023f0695c971b8f7f083d600cefccd8bb76057f515164499c0f0863c617400: Status 404 returned error can't find the container with id 53023f0695c971b8f7f083d600cefccd8bb76057f515164499c0f0863c617400
Apr 17 18:50:37.961632 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:37.961558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:37.961759 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:37.961700 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 18:50:37.961759 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:37.961721 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bb99cc44b-rr54f: secret "image-registry-tls" not found
Apr 17 18:50:37.961829 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:37.961775 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls podName:1b34c122-9e66-4876-ad10-d90898363166 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:38.961758136 +0000 UTC m=+78.235775408 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls") pod "image-registry-7bb99cc44b-rr54f" (UID: "1b34c122-9e66-4876-ad10-d90898363166") : secret "image-registry-tls" not found
Apr 17 18:50:38.516865 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:38.516830 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-88g9b" event={"ID":"c5fd97d3-b332-4c85-9344-c0e9f314aed6","Type":"ContainerStarted","Data":"53023f0695c971b8f7f083d600cefccd8bb76057f515164499c0f0863c617400"}
Apr 17 18:50:38.967446 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:38.967346 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:38.967609 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:38.967476 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 18:50:38.967609 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:38.967495 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bb99cc44b-rr54f: secret "image-registry-tls" not found
Apr 17 18:50:38.967609 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:38.967586 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls podName:1b34c122-9e66-4876-ad10-d90898363166 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:40.967566522 +0000 UTC m=+80.241583797 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls") pod "image-registry-7bb99cc44b-rr54f" (UID: "1b34c122-9e66-4876-ad10-d90898363166") : secret "image-registry-tls" not found
Apr 17 18:50:40.521909 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:40.521876 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-88g9b" event={"ID":"c5fd97d3-b332-4c85-9344-c0e9f314aed6","Type":"ContainerStarted","Data":"fcbe228a0b910472d5955143d790361f7c64018ed8249004501032f57b6ba5c9"}
Apr 17 18:50:40.536560 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:40.536504 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-88g9b" podStartSLOduration=1.316062957 podStartE2EDuration="3.536485898s" podCreationTimestamp="2026-04-17 18:50:37 +0000 UTC" firstStartedPulling="2026-04-17 18:50:37.745966165 +0000 UTC m=+77.019983437" lastFinishedPulling="2026-04-17 18:50:39.966389107 +0000 UTC m=+79.240406378" observedRunningTime="2026-04-17 18:50:40.535511408 +0000 UTC m=+79.809528702" watchObservedRunningTime="2026-04-17 18:50:40.536485898 +0000 UTC m=+79.810503191"
Apr 17 18:50:40.983750 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:40.983652 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:50:40.983921 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:40.983786 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 18:50:40.983921 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:40.983804 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bb99cc44b-rr54f: secret "image-registry-tls" not found
Apr 17 18:50:40.983921 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:40.983877 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls podName:1b34c122-9e66-4876-ad10-d90898363166 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:44.983858443 +0000 UTC m=+84.257875717 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls") pod "image-registry-7bb99cc44b-rr54f" (UID: "1b34c122-9e66-4876-ad10-d90898363166") : secret "image-registry-tls" not found
Apr 17 18:50:42.535228 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:42.535196 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mflf9"]
Apr 17 18:50:42.539228 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:42.539211 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mflf9"
Apr 17 18:50:42.541400 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:42.541378 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 17 18:50:42.541980 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:42.541966 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 17 18:50:42.542030 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:42.541987 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-hd6hw\""
Apr 17 18:50:42.544337 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:42.544319 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mflf9"]
Apr 17 18:50:42.596534 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:42.596509 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zccsg\" (UniqueName: \"kubernetes.io/projected/4c94010b-eab7-490c-8843-0f4859c4d6fd-kube-api-access-zccsg\") pod \"migrator-74bb7799d9-mflf9\" (UID: \"4c94010b-eab7-490c-8843-0f4859c4d6fd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mflf9"
Apr 17 18:50:42.697813 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:42.697788 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zccsg\" (UniqueName: \"kubernetes.io/projected/4c94010b-eab7-490c-8843-0f4859c4d6fd-kube-api-access-zccsg\") pod \"migrator-74bb7799d9-mflf9\" (UID: \"4c94010b-eab7-490c-8843-0f4859c4d6fd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mflf9"
Apr 17 18:50:42.704691 ip-10-0-132-192
kubenswrapper[2571]: I0417 18:50:42.704662 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zccsg\" (UniqueName: \"kubernetes.io/projected/4c94010b-eab7-490c-8843-0f4859c4d6fd-kube-api-access-zccsg\") pod \"migrator-74bb7799d9-mflf9\" (UID: \"4c94010b-eab7-490c-8843-0f4859c4d6fd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mflf9" Apr 17 18:50:42.847808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:42.847753 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mflf9" Apr 17 18:50:42.951730 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:42.951702 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mflf9"] Apr 17 18:50:42.956095 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:50:42.956058 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c94010b_eab7_490c_8843_0f4859c4d6fd.slice/crio-82df3a5a7c6f9622f10ca29d04daaed091ac18d0639b367ed2b60f5423313f7e WatchSource:0}: Error finding container 82df3a5a7c6f9622f10ca29d04daaed091ac18d0639b367ed2b60f5423313f7e: Status 404 returned error can't find the container with id 82df3a5a7c6f9622f10ca29d04daaed091ac18d0639b367ed2b60f5423313f7e Apr 17 18:50:43.207755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:43.207686 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-plbd4_8210137d-ed94-434c-897d-f67481261a39/dns-node-resolver/0.log" Apr 17 18:50:43.528985 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:43.528948 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mflf9" event={"ID":"4c94010b-eab7-490c-8843-0f4859c4d6fd","Type":"ContainerStarted","Data":"82df3a5a7c6f9622f10ca29d04daaed091ac18d0639b367ed2b60f5423313f7e"} 
Apr 17 18:50:44.209131 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.209095 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-t5hn4_3d42e5aa-a588-4ce1-a264-4581a72945cb/node-ca/0.log" Apr 17 18:50:44.226526 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.226506 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zvxl8"] Apr 17 18:50:44.228020 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.228005 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zvxl8" Apr 17 18:50:44.231871 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.231855 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-8cgv2\"" Apr 17 18:50:44.231970 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.231930 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 18:50:44.232434 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.232418 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 18:50:44.232511 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.232422 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 18:50:44.233583 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.233564 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 18:50:44.248247 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.248220 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zvxl8"] Apr 17 18:50:44.309502 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.309449 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/71a7c302-ef53-45bc-a694-b7520d3ee7e1-signing-cabundle\") pod \"service-ca-865cb79987-zvxl8\" (UID: \"71a7c302-ef53-45bc-a694-b7520d3ee7e1\") " pod="openshift-service-ca/service-ca-865cb79987-zvxl8" Apr 17 18:50:44.309622 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.309508 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/71a7c302-ef53-45bc-a694-b7520d3ee7e1-signing-key\") pod \"service-ca-865cb79987-zvxl8\" (UID: \"71a7c302-ef53-45bc-a694-b7520d3ee7e1\") " pod="openshift-service-ca/service-ca-865cb79987-zvxl8" Apr 17 18:50:44.309622 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.309531 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zx6k\" (UniqueName: \"kubernetes.io/projected/71a7c302-ef53-45bc-a694-b7520d3ee7e1-kube-api-access-8zx6k\") pod \"service-ca-865cb79987-zvxl8\" (UID: \"71a7c302-ef53-45bc-a694-b7520d3ee7e1\") " pod="openshift-service-ca/service-ca-865cb79987-zvxl8" Apr 17 18:50:44.410304 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.410270 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/71a7c302-ef53-45bc-a694-b7520d3ee7e1-signing-cabundle\") pod \"service-ca-865cb79987-zvxl8\" (UID: \"71a7c302-ef53-45bc-a694-b7520d3ee7e1\") " pod="openshift-service-ca/service-ca-865cb79987-zvxl8" Apr 17 18:50:44.410439 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.410309 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/71a7c302-ef53-45bc-a694-b7520d3ee7e1-signing-key\") pod \"service-ca-865cb79987-zvxl8\" (UID: 
\"71a7c302-ef53-45bc-a694-b7520d3ee7e1\") " pod="openshift-service-ca/service-ca-865cb79987-zvxl8" Apr 17 18:50:44.410439 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.410331 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zx6k\" (UniqueName: \"kubernetes.io/projected/71a7c302-ef53-45bc-a694-b7520d3ee7e1-kube-api-access-8zx6k\") pod \"service-ca-865cb79987-zvxl8\" (UID: \"71a7c302-ef53-45bc-a694-b7520d3ee7e1\") " pod="openshift-service-ca/service-ca-865cb79987-zvxl8" Apr 17 18:50:44.411009 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.410991 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/71a7c302-ef53-45bc-a694-b7520d3ee7e1-signing-cabundle\") pod \"service-ca-865cb79987-zvxl8\" (UID: \"71a7c302-ef53-45bc-a694-b7520d3ee7e1\") " pod="openshift-service-ca/service-ca-865cb79987-zvxl8" Apr 17 18:50:44.412699 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.412679 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/71a7c302-ef53-45bc-a694-b7520d3ee7e1-signing-key\") pod \"service-ca-865cb79987-zvxl8\" (UID: \"71a7c302-ef53-45bc-a694-b7520d3ee7e1\") " pod="openshift-service-ca/service-ca-865cb79987-zvxl8" Apr 17 18:50:44.417864 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.417841 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zx6k\" (UniqueName: \"kubernetes.io/projected/71a7c302-ef53-45bc-a694-b7520d3ee7e1-kube-api-access-8zx6k\") pod \"service-ca-865cb79987-zvxl8\" (UID: \"71a7c302-ef53-45bc-a694-b7520d3ee7e1\") " pod="openshift-service-ca/service-ca-865cb79987-zvxl8" Apr 17 18:50:44.533104 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.533069 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mflf9" 
event={"ID":"4c94010b-eab7-490c-8843-0f4859c4d6fd","Type":"ContainerStarted","Data":"b44646be5ef2af89490039c5598793a06e4ecbadf776b741c662dcb5b73f2137"} Apr 17 18:50:44.533245 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.533118 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mflf9" event={"ID":"4c94010b-eab7-490c-8843-0f4859c4d6fd","Type":"ContainerStarted","Data":"bfd0b190a23c20c556e3c79ed72708f2cd7151d5fe428c8eec9a61277b93f6f1"} Apr 17 18:50:44.536273 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.536252 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zvxl8" Apr 17 18:50:44.548042 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.548001 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mflf9" podStartSLOduration=1.330498803 podStartE2EDuration="2.547987905s" podCreationTimestamp="2026-04-17 18:50:42 +0000 UTC" firstStartedPulling="2026-04-17 18:50:42.958780229 +0000 UTC m=+82.232797501" lastFinishedPulling="2026-04-17 18:50:44.176269332 +0000 UTC m=+83.450286603" observedRunningTime="2026-04-17 18:50:44.54653647 +0000 UTC m=+83.820553762" watchObservedRunningTime="2026-04-17 18:50:44.547987905 +0000 UTC m=+83.822005197" Apr 17 18:50:44.646339 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:44.646311 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zvxl8"] Apr 17 18:50:44.649952 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:50:44.649922 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71a7c302_ef53_45bc_a694_b7520d3ee7e1.slice/crio-0849cb7116bf13438ded3c63fbd6b48969aa77d30f2e46681b8a6e6f2d616463 WatchSource:0}: Error finding container 
0849cb7116bf13438ded3c63fbd6b48969aa77d30f2e46681b8a6e6f2d616463: Status 404 returned error can't find the container with id 0849cb7116bf13438ded3c63fbd6b48969aa77d30f2e46681b8a6e6f2d616463 Apr 17 18:50:45.015124 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:45.015095 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" Apr 17 18:50:45.015273 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:45.015233 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:50:45.015273 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:45.015250 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bb99cc44b-rr54f: secret "image-registry-tls" not found Apr 17 18:50:45.015340 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:50:45.015304 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls podName:1b34c122-9e66-4876-ad10-d90898363166 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:53.015288492 +0000 UTC m=+92.289305763 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls") pod "image-registry-7bb99cc44b-rr54f" (UID: "1b34c122-9e66-4876-ad10-d90898363166") : secret "image-registry-tls" not found Apr 17 18:50:45.537409 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:45.537371 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zvxl8" event={"ID":"71a7c302-ef53-45bc-a694-b7520d3ee7e1","Type":"ContainerStarted","Data":"0849cb7116bf13438ded3c63fbd6b48969aa77d30f2e46681b8a6e6f2d616463"} Apr 17 18:50:47.544508 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:47.544470 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zvxl8" event={"ID":"71a7c302-ef53-45bc-a694-b7520d3ee7e1","Type":"ContainerStarted","Data":"50aebff7fa5f599e1ec9ce8e830ef19abcef5e8eab24d9c929b043c75df9f96e"} Apr 17 18:50:47.558934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:47.558888 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-zvxl8" podStartSLOduration=1.7160699830000001 podStartE2EDuration="3.558875957s" podCreationTimestamp="2026-04-17 18:50:44 +0000 UTC" firstStartedPulling="2026-04-17 18:50:44.651731102 +0000 UTC m=+83.925748373" lastFinishedPulling="2026-04-17 18:50:46.494537075 +0000 UTC m=+85.768554347" observedRunningTime="2026-04-17 18:50:47.557857812 +0000 UTC m=+86.831875104" watchObservedRunningTime="2026-04-17 18:50:47.558875957 +0000 UTC m=+86.832893250" Apr 17 18:50:53.071860 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:53.071826 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " 
pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" Apr 17 18:50:53.074284 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:53.074259 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls\") pod \"image-registry-7bb99cc44b-rr54f\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" Apr 17 18:50:53.233137 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:53.233098 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" Apr 17 18:50:53.353416 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:53.353351 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bb99cc44b-rr54f"] Apr 17 18:50:53.356448 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:50:53.356422 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b34c122_9e66_4876_ad10_d90898363166.slice/crio-b98f65ecfb6f034217e3248f3e93a313ec94323e1e80498da55222d61fab2896 WatchSource:0}: Error finding container b98f65ecfb6f034217e3248f3e93a313ec94323e1e80498da55222d61fab2896: Status 404 returned error can't find the container with id b98f65ecfb6f034217e3248f3e93a313ec94323e1e80498da55222d61fab2896 Apr 17 18:50:53.566235 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:53.566190 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" event={"ID":"1b34c122-9e66-4876-ad10-d90898363166","Type":"ContainerStarted","Data":"8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca"} Apr 17 18:50:53.566235 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:53.566232 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" event={"ID":"1b34c122-9e66-4876-ad10-d90898363166","Type":"ContainerStarted","Data":"b98f65ecfb6f034217e3248f3e93a313ec94323e1e80498da55222d61fab2896"} Apr 17 18:50:53.566506 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:53.566338 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" Apr 17 18:50:53.584677 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:53.584628 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" podStartSLOduration=16.584612986 podStartE2EDuration="16.584612986s" podCreationTimestamp="2026-04-17 18:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:50:53.583405222 +0000 UTC m=+92.857422516" watchObservedRunningTime="2026-04-17 18:50:53.584612986 +0000 UTC m=+92.858630280" Apr 17 18:50:56.495873 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:56.495838 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7" Apr 17 18:50:56.497903 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:56.497880 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029e659d-f8ff-4796-ba06-aba3f3e2a830-metrics-tls\") pod \"dns-default-gr8p7\" (UID: \"029e659d-f8ff-4796-ba06-aba3f3e2a830\") " pod="openshift-dns/dns-default-gr8p7" Apr 17 18:50:56.596592 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:56.596562 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86" Apr 17 18:50:56.598690 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:56.598669 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff724f21-8096-4540-9e0b-484999e3ecd1-cert\") pod \"ingress-canary-ntz86\" (UID: \"ff724f21-8096-4540-9e0b-484999e3ecd1\") " pod="openshift-ingress-canary/ingress-canary-ntz86" Apr 17 18:50:56.603614 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:56.603594 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ctvd8\"" Apr 17 18:50:56.612242 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:56.612226 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gr8p7" Apr 17 18:50:56.613997 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:56.613980 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ppj8p\"" Apr 17 18:50:56.622569 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:56.622552 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ntz86" Apr 17 18:50:56.735308 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:56.735263 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gr8p7"] Apr 17 18:50:56.738417 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:50:56.738383 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod029e659d_f8ff_4796_ba06_aba3f3e2a830.slice/crio-7330db24e60f264c3a3a944f7a9579472f67f151a7ee3dfa004f75a41999d317 WatchSource:0}: Error finding container 7330db24e60f264c3a3a944f7a9579472f67f151a7ee3dfa004f75a41999d317: Status 404 returned error can't find the container with id 7330db24e60f264c3a3a944f7a9579472f67f151a7ee3dfa004f75a41999d317 Apr 17 18:50:56.751427 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:56.751376 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ntz86"] Apr 17 18:50:56.754819 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:50:56.754793 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff724f21_8096_4540_9e0b_484999e3ecd1.slice/crio-cb382b351ffff66c09436a7e2aa9abb644e1b3efc0e379d0c147b491ce31c997 WatchSource:0}: Error finding container cb382b351ffff66c09436a7e2aa9abb644e1b3efc0e379d0c147b491ce31c997: Status 404 returned error can't find the container with id cb382b351ffff66c09436a7e2aa9abb644e1b3efc0e379d0c147b491ce31c997 Apr 17 18:50:57.579967 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:57.579926 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ntz86" event={"ID":"ff724f21-8096-4540-9e0b-484999e3ecd1","Type":"ContainerStarted","Data":"cb382b351ffff66c09436a7e2aa9abb644e1b3efc0e379d0c147b491ce31c997"} Apr 17 18:50:57.581738 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:57.581707 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gr8p7" event={"ID":"029e659d-f8ff-4796-ba06-aba3f3e2a830","Type":"ContainerStarted","Data":"7330db24e60f264c3a3a944f7a9579472f67f151a7ee3dfa004f75a41999d317"} Apr 17 18:50:59.587535 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:59.587496 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gr8p7" event={"ID":"029e659d-f8ff-4796-ba06-aba3f3e2a830","Type":"ContainerStarted","Data":"07e2cffeb0857ab14f9e08449daff1b85dea56456e6a4c672ce3f8936c30f31e"} Apr 17 18:50:59.587535 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:59.587539 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gr8p7" event={"ID":"029e659d-f8ff-4796-ba06-aba3f3e2a830","Type":"ContainerStarted","Data":"3a44e83b2955faae847048c12df0eadf4bf07ef735533e3dcf353de9f2495102"} Apr 17 18:50:59.587993 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:59.587657 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gr8p7" Apr 17 18:50:59.588825 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:59.588803 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ntz86" event={"ID":"ff724f21-8096-4540-9e0b-484999e3ecd1","Type":"ContainerStarted","Data":"336fa85d60953915bde83fb95efc3f5a229afbc41f2b741dc3038b70ea2c1722"} Apr 17 18:50:59.603780 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:50:59.603739 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gr8p7" podStartSLOduration=65.715746088 podStartE2EDuration="1m7.603724994s" podCreationTimestamp="2026-04-17 18:49:52 +0000 UTC" firstStartedPulling="2026-04-17 18:50:56.740694787 +0000 UTC m=+96.014712058" lastFinishedPulling="2026-04-17 18:50:58.628673692 +0000 UTC m=+97.902690964" observedRunningTime="2026-04-17 18:50:59.603490474 +0000 UTC m=+98.877507766" 
watchObservedRunningTime="2026-04-17 18:50:59.603724994 +0000 UTC m=+98.877742286" Apr 17 18:51:08.032976 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.032925 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ntz86" podStartSLOduration=74.156601723 podStartE2EDuration="1m16.032909988s" podCreationTimestamp="2026-04-17 18:49:52 +0000 UTC" firstStartedPulling="2026-04-17 18:50:56.756531123 +0000 UTC m=+96.030548394" lastFinishedPulling="2026-04-17 18:50:58.632839388 +0000 UTC m=+97.906856659" observedRunningTime="2026-04-17 18:50:59.615702658 +0000 UTC m=+98.889719964" watchObservedRunningTime="2026-04-17 18:51:08.032909988 +0000 UTC m=+107.306927308" Apr 17 18:51:08.033749 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.033726 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bb99cc44b-rr54f"] Apr 17 18:51:08.038642 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.038614 2571 patch_prober.go:28] interesting pod/image-registry-7bb99cc44b-rr54f container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 18:51:08.038755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.038661 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" podUID="1b34c122-9e66-4876-ad10-d90898363166" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:51:08.105995 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.105965 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2f7gh"] Apr 17 18:51:08.109028 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.109014 2571 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2f7gh" Apr 17 18:51:08.111651 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.111626 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 18:51:08.111651 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.111639 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-njpjx\"" Apr 17 18:51:08.111803 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.111725 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 18:51:08.121632 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.121610 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2f7gh"] Apr 17 18:51:08.182066 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.182035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b37994a4-c3f2-4b31-a4bf-86096fa268fb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh" Apr 17 18:51:08.182195 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.182071 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b37994a4-c3f2-4b31-a4bf-86096fa268fb-crio-socket\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh" Apr 17 18:51:08.182195 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.182089 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b37994a4-c3f2-4b31-a4bf-86096fa268fb-data-volume\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.182195 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.182118 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjbcd\" (UniqueName: \"kubernetes.io/projected/b37994a4-c3f2-4b31-a4bf-86096fa268fb-kube-api-access-xjbcd\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.182339 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.182219 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b37994a4-c3f2-4b31-a4bf-86096fa268fb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.282688 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.282660 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjbcd\" (UniqueName: \"kubernetes.io/projected/b37994a4-c3f2-4b31-a4bf-86096fa268fb-kube-api-access-xjbcd\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.282804 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.282698 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b37994a4-c3f2-4b31-a4bf-86096fa268fb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.282804 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.282747 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b37994a4-c3f2-4b31-a4bf-86096fa268fb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.282804 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.282767 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b37994a4-c3f2-4b31-a4bf-86096fa268fb-crio-socket\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.282804 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.282786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b37994a4-c3f2-4b31-a4bf-86096fa268fb-data-volume\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.282939 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.282903 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b37994a4-c3f2-4b31-a4bf-86096fa268fb-crio-socket\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.283123 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.283086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b37994a4-c3f2-4b31-a4bf-86096fa268fb-data-volume\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.283302 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.283280 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b37994a4-c3f2-4b31-a4bf-86096fa268fb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.285195 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.285172 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b37994a4-c3f2-4b31-a4bf-86096fa268fb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.290449 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.290426 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjbcd\" (UniqueName: \"kubernetes.io/projected/b37994a4-c3f2-4b31-a4bf-86096fa268fb-kube-api-access-xjbcd\") pod \"insights-runtime-extractor-2f7gh\" (UID: \"b37994a4-c3f2-4b31-a4bf-86096fa268fb\") " pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.417861 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.417832 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2f7gh"
Apr 17 18:51:08.531606 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.531578 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2f7gh"]
Apr 17 18:51:08.534696 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:51:08.534639 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37994a4_c3f2_4b31_a4bf_86096fa268fb.slice/crio-e5c599d780828f4f828a8aef331660709c9888a13119082ca995c524c6fe111a WatchSource:0}: Error finding container e5c599d780828f4f828a8aef331660709c9888a13119082ca995c524c6fe111a: Status 404 returned error can't find the container with id e5c599d780828f4f828a8aef331660709c9888a13119082ca995c524c6fe111a
Apr 17 18:51:08.609724 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.609697 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2f7gh" event={"ID":"b37994a4-c3f2-4b31-a4bf-86096fa268fb","Type":"ContainerStarted","Data":"a803e8677f1cc2a228044897b65c275410a5169643fba8e0317fed6dccd66bcb"}
Apr 17 18:51:08.609819 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:08.609729 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2f7gh" event={"ID":"b37994a4-c3f2-4b31-a4bf-86096fa268fb","Type":"ContainerStarted","Data":"e5c599d780828f4f828a8aef331660709c9888a13119082ca995c524c6fe111a"}
Apr 17 18:51:09.593335 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:09.593309 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gr8p7"
Apr 17 18:51:09.613749 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:09.613720 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2f7gh" event={"ID":"b37994a4-c3f2-4b31-a4bf-86096fa268fb","Type":"ContainerStarted","Data":"a2fb82280ff5963eadd5cd2f6670e0a0726bb0e0198545617dfdc3be2630f183"}
Apr 17 18:51:12.623057 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:12.623019 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2f7gh" event={"ID":"b37994a4-c3f2-4b31-a4bf-86096fa268fb","Type":"ContainerStarted","Data":"8539feb00282435ca5e73dff4b0dd730dcb87e236fedc9773d7364e44b12d1da"}
Apr 17 18:51:12.642343 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:12.642302 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2f7gh" podStartSLOduration=1.3653946129999999 podStartE2EDuration="4.642286928s" podCreationTimestamp="2026-04-17 18:51:08 +0000 UTC" firstStartedPulling="2026-04-17 18:51:08.589452682 +0000 UTC m=+107.863469954" lastFinishedPulling="2026-04-17 18:51:11.866344991 +0000 UTC m=+111.140362269" observedRunningTime="2026-04-17 18:51:12.642153051 +0000 UTC m=+111.916170344" watchObservedRunningTime="2026-04-17 18:51:12.642286928 +0000 UTC m=+111.916304222"
Apr 17 18:51:18.037842 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:18.037810 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f"
Apr 17 18:51:19.949259 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:19.949229 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jjbzz"]
Apr 17 18:51:19.954846 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:19.954819 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:19.957281 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:19.957255 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 18:51:19.957418 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:19.957280 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 18:51:19.957418 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:19.957282 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 18:51:19.957418 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:19.957352 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7jr9c\""
Apr 17 18:51:19.957637 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:19.957619 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 18:51:19.958015 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:19.957995 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 18:51:19.958187 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:19.958167 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 18:51:20.067729 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.067695 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a3e7ac2-a313-4475-835e-44fbfe441ae1-sys\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.067898 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.067736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-accelerators-collector-config\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.067898 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.067758 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-tls\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.067898 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.067815 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-textfile\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.067898 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.067847 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g92w6\" (UniqueName: \"kubernetes.io/projected/5a3e7ac2-a313-4475-835e-44fbfe441ae1-kube-api-access-g92w6\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.067898 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.067878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5a3e7ac2-a313-4475-835e-44fbfe441ae1-root\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.067898 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.067898 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.068217 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.067928 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3e7ac2-a313-4475-835e-44fbfe441ae1-metrics-client-ca\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.068217 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.067956 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-wtmp\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.169105 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.169072 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a3e7ac2-a313-4475-835e-44fbfe441ae1-sys\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.169289 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.169116 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-accelerators-collector-config\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.169289 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.169149 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-tls\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.169289 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.169190 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-textfile\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.169289 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.169199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a3e7ac2-a313-4475-835e-44fbfe441ae1-sys\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.169289 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.169213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g92w6\" (UniqueName: \"kubernetes.io/projected/5a3e7ac2-a313-4475-835e-44fbfe441ae1-kube-api-access-g92w6\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.169289 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.169238 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5a3e7ac2-a313-4475-835e-44fbfe441ae1-root\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.169289 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.169267 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.169676 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.169311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3e7ac2-a313-4475-835e-44fbfe441ae1-metrics-client-ca\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.169676 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:51:20.169330 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 18:51:20.169676 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.169343 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-wtmp\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.169676 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:51:20.169401 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-tls podName:5a3e7ac2-a313-4475-835e-44fbfe441ae1 nodeName:}" failed. No retries permitted until 2026-04-17 18:51:20.669379693 +0000 UTC m=+119.943396985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-tls") pod "node-exporter-jjbzz" (UID: "5a3e7ac2-a313-4475-835e-44fbfe441ae1") : secret "node-exporter-tls" not found
Apr 17 18:51:20.169676 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.169528 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-wtmp\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.169984 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.169846 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5a3e7ac2-a313-4475-835e-44fbfe441ae1-root\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.170640 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.170614 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-accelerators-collector-config\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.170772 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.170715 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-textfile\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.171208 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.171188 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3e7ac2-a313-4475-835e-44fbfe441ae1-metrics-client-ca\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.173250 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.173225 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.182724 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.182702 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g92w6\" (UniqueName: \"kubernetes.io/projected/5a3e7ac2-a313-4475-835e-44fbfe441ae1-kube-api-access-g92w6\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.674207 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.674174 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-tls\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.676447 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.676414 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5a3e7ac2-a313-4475-835e-44fbfe441ae1-node-exporter-tls\") pod \"node-exporter-jjbzz\" (UID: \"5a3e7ac2-a313-4475-835e-44fbfe441ae1\") " pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.864911 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.864877 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jjbzz"
Apr 17 18:51:20.872474 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:51:20.872437 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a3e7ac2_a313_4475_835e_44fbfe441ae1.slice/crio-3800803756c69e03f38d3b6160233e5d5812f512dc276fff51fd419f0071b563 WatchSource:0}: Error finding container 3800803756c69e03f38d3b6160233e5d5812f512dc276fff51fd419f0071b563: Status 404 returned error can't find the container with id 3800803756c69e03f38d3b6160233e5d5812f512dc276fff51fd419f0071b563
Apr 17 18:51:20.996270 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:20.996177 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 18:51:21.001834 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.001815 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.004530 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.004504 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 18:51:21.004654 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.004505 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 18:51:21.004654 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.004506 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 18:51:21.004654 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.004506 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 18:51:21.004804 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.004745 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-57tht\""
Apr 17 18:51:21.004804 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.004748 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 18:51:21.004804 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.004748 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 18:51:21.004804 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.004787 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 18:51:21.004996 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.004870 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 18:51:21.005074 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.005059 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 18:51:21.010589 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.010426 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 18:51:21.077366 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077336 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/742a2272-576d-4cfe-87a1-e174cb5be061-config-out\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.077541 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077375 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shp2l\" (UniqueName: \"kubernetes.io/projected/742a2272-576d-4cfe-87a1-e174cb5be061-kube-api-access-shp2l\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.077541 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077399 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-config-volume\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.077541 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077441 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/742a2272-576d-4cfe-87a1-e174cb5be061-tls-assets\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.077541 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077497 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/742a2272-576d-4cfe-87a1-e174cb5be061-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.077541 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077520 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-web-config\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.077541 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077537 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.077780 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077555 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/742a2272-576d-4cfe-87a1-e174cb5be061-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.077780 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077577 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.077780 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077612 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742a2272-576d-4cfe-87a1-e174cb5be061-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.077780 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077646 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.077780 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077694 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.077780 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.077729 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178387 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178364 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/742a2272-576d-4cfe-87a1-e174cb5be061-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178552 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-web-config\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178552 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178412 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178552 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178430 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/742a2272-576d-4cfe-87a1-e174cb5be061-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178552 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178508 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178552 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742a2272-576d-4cfe-87a1-e174cb5be061-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178835 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178835 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178611 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178835 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178634 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178835 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/742a2272-576d-4cfe-87a1-e174cb5be061-config-out\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178835 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178715 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shp2l\" (UniqueName: \"kubernetes.io/projected/742a2272-576d-4cfe-87a1-e174cb5be061-kube-api-access-shp2l\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178835 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178760 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-config-volume\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.178835 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.178803 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/742a2272-576d-4cfe-87a1-e174cb5be061-tls-assets\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.179220 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.179177 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/742a2272-576d-4cfe-87a1-e174cb5be061-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.179294 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.179274 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/742a2272-576d-4cfe-87a1-e174cb5be061-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.180850 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.180829 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/742a2272-576d-4cfe-87a1-e174cb5be061-config-out\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:51:21.181143 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.181123 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 18:51:21.181280 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.181139 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 18:51:21.181280 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.181191 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 18:51:21.181280 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.181243 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 18:51:21.181280 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.181246 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 18:51:21.181280 ip-10-0-132-192
kubenswrapper[2571]: I0417 18:51:21.181197 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 18:51:21.181486 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.181373 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 18:51:21.181486 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.181384 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 18:51:21.186969 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.186949 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 18:51:21.187443 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.187428 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shp2l\" (UniqueName: \"kubernetes.io/projected/742a2272-576d-4cfe-87a1-e174cb5be061-kube-api-access-shp2l\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:51:21.190767 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.190487 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742a2272-576d-4cfe-87a1-e174cb5be061-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:51:21.192154 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.192128 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: 
\"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:51:21.192251 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.192158 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:51:21.192312 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.192284 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:51:21.192312 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.192292 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/742a2272-576d-4cfe-87a1-e174cb5be061-tls-assets\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:51:21.192952 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.192905 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-config-volume\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:51:21.193383 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.193349 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:51:21.193778 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.193752 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:51:21.194113 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.194090 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-web-config\") pod \"alertmanager-main-0\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:51:21.315775 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.315753 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-57tht\"" Apr 17 18:51:21.324519 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.324497 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:51:21.450419 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.450383 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 18:51:21.454498 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:51:21.454452 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod742a2272_576d_4cfe_87a1_e174cb5be061.slice/crio-e6749f4fac3256e5c4ee844bb2a712973ed8c6484aba993d5b4434286142cb34 WatchSource:0}: Error finding container e6749f4fac3256e5c4ee844bb2a712973ed8c6484aba993d5b4434286142cb34: Status 404 returned error can't find the container with id e6749f4fac3256e5c4ee844bb2a712973ed8c6484aba993d5b4434286142cb34 Apr 17 18:51:21.647973 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.647880 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerStarted","Data":"e6749f4fac3256e5c4ee844bb2a712973ed8c6484aba993d5b4434286142cb34"} Apr 17 18:51:21.649820 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:21.649792 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jjbzz" event={"ID":"5a3e7ac2-a313-4475-835e-44fbfe441ae1","Type":"ContainerStarted","Data":"3800803756c69e03f38d3b6160233e5d5812f512dc276fff51fd419f0071b563"} Apr 17 18:51:22.653427 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:22.653391 2571 generic.go:358] "Generic (PLEG): container finished" podID="742a2272-576d-4cfe-87a1-e174cb5be061" containerID="b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0" exitCode=0 Apr 17 18:51:22.653844 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:22.653484 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerDied","Data":"b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0"} Apr 17 18:51:22.655059 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:22.655037 2571 generic.go:358] "Generic (PLEG): container finished" podID="5a3e7ac2-a313-4475-835e-44fbfe441ae1" containerID="b9bff7f0552e3f72310cdcb04e75030601e608c34a220a0e2baf2622bc719b9e" exitCode=0 Apr 17 18:51:22.655127 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:22.655094 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jjbzz" event={"ID":"5a3e7ac2-a313-4475-835e-44fbfe441ae1","Type":"ContainerDied","Data":"b9bff7f0552e3f72310cdcb04e75030601e608c34a220a0e2baf2622bc719b9e"} Apr 17 18:51:23.660147 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:23.660108 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jjbzz" event={"ID":"5a3e7ac2-a313-4475-835e-44fbfe441ae1","Type":"ContainerStarted","Data":"4821c4a29996eeca613c777294b08a7b8ee5542dbf6b23aacb87cb68119ab833"} Apr 17 18:51:23.660147 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:23.660154 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jjbzz" event={"ID":"5a3e7ac2-a313-4475-835e-44fbfe441ae1","Type":"ContainerStarted","Data":"eb1ef3d0246d82e12d9b0e1ca22b91694fba19842dcbffd81d8e27487ff0355f"} Apr 17 18:51:23.677264 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:23.677218 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jjbzz" podStartSLOduration=3.783265435 podStartE2EDuration="4.677204101s" podCreationTimestamp="2026-04-17 18:51:19 +0000 UTC" firstStartedPulling="2026-04-17 18:51:20.874085424 +0000 UTC m=+120.148102695" lastFinishedPulling="2026-04-17 18:51:21.768024083 +0000 UTC m=+121.042041361" observedRunningTime="2026-04-17 18:51:23.67607037 +0000 UTC m=+122.950087664" 
watchObservedRunningTime="2026-04-17 18:51:23.677204101 +0000 UTC m=+122.951221432" Apr 17 18:51:24.715701 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:24.715676 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6"] Apr 17 18:51:24.718678 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:24.718658 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6" Apr 17 18:51:24.720777 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:24.720756 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 18:51:24.720894 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:24.720824 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-9pkpm\"" Apr 17 18:51:24.725622 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:24.725581 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6"] Apr 17 18:51:24.808503 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:24.808480 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d2259eb2-1dc7-4fbb-95f8-174464456871-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r9qw6\" (UID: \"d2259eb2-1dc7-4fbb-95f8-174464456871\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6" Apr 17 18:51:24.909645 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:24.909617 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d2259eb2-1dc7-4fbb-95f8-174464456871-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r9qw6\" (UID: \"d2259eb2-1dc7-4fbb-95f8-174464456871\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6" Apr 17 18:51:24.909801 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:51:24.909784 2571 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 18:51:24.909872 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:51:24.909843 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2259eb2-1dc7-4fbb-95f8-174464456871-monitoring-plugin-cert podName:d2259eb2-1dc7-4fbb-95f8-174464456871 nodeName:}" failed. No retries permitted until 2026-04-17 18:51:25.409823865 +0000 UTC m=+124.683841136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/d2259eb2-1dc7-4fbb-95f8-174464456871-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-r9qw6" (UID: "d2259eb2-1dc7-4fbb-95f8-174464456871") : secret "monitoring-plugin-cert" not found Apr 17 18:51:25.413939 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:25.413902 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d2259eb2-1dc7-4fbb-95f8-174464456871-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r9qw6\" (UID: \"d2259eb2-1dc7-4fbb-95f8-174464456871\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6" Apr 17 18:51:25.416550 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:25.416524 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d2259eb2-1dc7-4fbb-95f8-174464456871-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r9qw6\" (UID: \"d2259eb2-1dc7-4fbb-95f8-174464456871\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6" Apr 17 18:51:25.642851 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:25.642822 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6" Apr 17 18:51:25.668875 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:25.668818 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerStarted","Data":"cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5"} Apr 17 18:51:25.668875 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:25.668850 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerStarted","Data":"167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556"} Apr 17 18:51:25.668875 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:25.668859 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerStarted","Data":"8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168"} Apr 17 18:51:25.668875 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:25.668868 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerStarted","Data":"54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32"} Apr 17 18:51:25.668875 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:25.668875 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerStarted","Data":"f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676"} Apr 17 18:51:25.879378 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:25.879352 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6"] Apr 17 18:51:25.881815 
ip-10-0-132-192 kubenswrapper[2571]: W0417 18:51:25.881790 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2259eb2_1dc7_4fbb_95f8_174464456871.slice/crio-2410b3db2e66b53aff6760b2ac9a5889bd7f2614489da61404c11abd87ca2654 WatchSource:0}: Error finding container 2410b3db2e66b53aff6760b2ac9a5889bd7f2614489da61404c11abd87ca2654: Status 404 returned error can't find the container with id 2410b3db2e66b53aff6760b2ac9a5889bd7f2614489da61404c11abd87ca2654 Apr 17 18:51:26.675589 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:26.675552 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerStarted","Data":"cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5"} Apr 17 18:51:26.676878 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:26.676848 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6" event={"ID":"d2259eb2-1dc7-4fbb-95f8-174464456871","Type":"ContainerStarted","Data":"2410b3db2e66b53aff6760b2ac9a5889bd7f2614489da61404c11abd87ca2654"} Apr 17 18:51:26.706648 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:26.706603 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.356688638 podStartE2EDuration="6.706585358s" podCreationTimestamp="2026-04-17 18:51:20 +0000 UTC" firstStartedPulling="2026-04-17 18:51:21.456930821 +0000 UTC m=+120.730948093" lastFinishedPulling="2026-04-17 18:51:25.806827539 +0000 UTC m=+125.080844813" observedRunningTime="2026-04-17 18:51:26.705366542 +0000 UTC m=+125.979383835" watchObservedRunningTime="2026-04-17 18:51:26.706585358 +0000 UTC m=+125.980602650" Apr 17 18:51:27.681004 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:27.680915 2571 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6" event={"ID":"d2259eb2-1dc7-4fbb-95f8-174464456871","Type":"ContainerStarted","Data":"45a18892f90b4a40076a1a048ec31bafd7af3be62a58f4c18062cd64a98ba781"} Apr 17 18:51:27.681428 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:27.681394 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6" Apr 17 18:51:27.686438 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:27.686414 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6" Apr 17 18:51:27.694704 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:27.694666 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9qw6" podStartSLOduration=2.316602656 podStartE2EDuration="3.694653663s" podCreationTimestamp="2026-04-17 18:51:24 +0000 UTC" firstStartedPulling="2026-04-17 18:51:25.883605843 +0000 UTC m=+125.157623114" lastFinishedPulling="2026-04-17 18:51:27.261656835 +0000 UTC m=+126.535674121" observedRunningTime="2026-04-17 18:51:27.694313181 +0000 UTC m=+126.968330475" watchObservedRunningTime="2026-04-17 18:51:27.694653663 +0000 UTC m=+126.968670956" Apr 17 18:51:29.949754 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:29.949719 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs\") pod \"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:51:29.952013 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:29.951988 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9cb68ed8-ce9b-48b8-9980-07d87baf968b-metrics-certs\") pod 
\"network-metrics-daemon-d99zz\" (UID: \"9cb68ed8-ce9b-48b8-9980-07d87baf968b\") " pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:51:30.146047 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:30.146017 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dwb49\"" Apr 17 18:51:30.155106 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:30.155086 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d99zz" Apr 17 18:51:30.270624 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:30.270587 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d99zz"] Apr 17 18:51:30.274119 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:51:30.274094 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cb68ed8_ce9b_48b8_9980_07d87baf968b.slice/crio-19df2ee858feb110f47a341240bc0a054263e8f36db4022dfad1f43d677f0fda WatchSource:0}: Error finding container 19df2ee858feb110f47a341240bc0a054263e8f36db4022dfad1f43d677f0fda: Status 404 returned error can't find the container with id 19df2ee858feb110f47a341240bc0a054263e8f36db4022dfad1f43d677f0fda Apr 17 18:51:30.690129 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:30.690081 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d99zz" event={"ID":"9cb68ed8-ce9b-48b8-9980-07d87baf968b","Type":"ContainerStarted","Data":"19df2ee858feb110f47a341240bc0a054263e8f36db4022dfad1f43d677f0fda"} Apr 17 18:51:31.694671 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:31.694637 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d99zz" event={"ID":"9cb68ed8-ce9b-48b8-9980-07d87baf968b","Type":"ContainerStarted","Data":"c2dfd50de50893d7a2c842e85fefc588cbe29ed37681dae07bff719b86c04aad"} Apr 
17 18:51:31.694671 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:31.694674 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d99zz" event={"ID":"9cb68ed8-ce9b-48b8-9980-07d87baf968b","Type":"ContainerStarted","Data":"ab8ad35beccb65579b579c2173db856af3284a0d9c7d142f562fc46df903042f"} Apr 17 18:51:31.710657 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:31.710602 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-d99zz" podStartSLOduration=129.788256498 podStartE2EDuration="2m10.71058427s" podCreationTimestamp="2026-04-17 18:49:21 +0000 UTC" firstStartedPulling="2026-04-17 18:51:30.276013404 +0000 UTC m=+129.550030690" lastFinishedPulling="2026-04-17 18:51:31.198341191 +0000 UTC m=+130.472358462" observedRunningTime="2026-04-17 18:51:31.709235054 +0000 UTC m=+130.983252352" watchObservedRunningTime="2026-04-17 18:51:31.71058427 +0000 UTC m=+130.984601564" Apr 17 18:51:33.056513 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.056477 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" podUID="1b34c122-9e66-4876-ad10-d90898363166" containerName="registry" containerID="cri-o://8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca" gracePeriod=30 Apr 17 18:51:33.286726 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.286705 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" Apr 17 18:51:33.376070 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.376009 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b34c122-9e66-4876-ad10-d90898363166-trusted-ca\") pod \"1b34c122-9e66-4876-ad10-d90898363166\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " Apr 17 18:51:33.376070 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.376056 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b34c122-9e66-4876-ad10-d90898363166-registry-certificates\") pod \"1b34c122-9e66-4876-ad10-d90898363166\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " Apr 17 18:51:33.376240 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.376081 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b34c122-9e66-4876-ad10-d90898363166-installation-pull-secrets\") pod \"1b34c122-9e66-4876-ad10-d90898363166\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " Apr 17 18:51:33.376240 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.376107 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw4z7\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-kube-api-access-dw4z7\") pod \"1b34c122-9e66-4876-ad10-d90898363166\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " Apr 17 18:51:33.376240 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.376136 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls\") pod \"1b34c122-9e66-4876-ad10-d90898363166\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " Apr 17 
18:51:33.376240 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.376150 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-bound-sa-token\") pod \"1b34c122-9e66-4876-ad10-d90898363166\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " Apr 17 18:51:33.376240 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.376177 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b34c122-9e66-4876-ad10-d90898363166-ca-trust-extracted\") pod \"1b34c122-9e66-4876-ad10-d90898363166\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " Apr 17 18:51:33.376497 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.376450 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b34c122-9e66-4876-ad10-d90898363166-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1b34c122-9e66-4876-ad10-d90898363166" (UID: "1b34c122-9e66-4876-ad10-d90898363166"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:51:33.376575 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.376541 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b34c122-9e66-4876-ad10-d90898363166-image-registry-private-configuration\") pod \"1b34c122-9e66-4876-ad10-d90898363166\" (UID: \"1b34c122-9e66-4876-ad10-d90898363166\") " Apr 17 18:51:33.377232 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.377196 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b34c122-9e66-4876-ad10-d90898363166-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1b34c122-9e66-4876-ad10-d90898363166" (UID: "1b34c122-9e66-4876-ad10-d90898363166"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:51:33.379644 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.378402 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b34c122-9e66-4876-ad10-d90898363166-trusted-ca\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:51:33.379644 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.378437 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b34c122-9e66-4876-ad10-d90898363166-registry-certificates\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:51:33.383189 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.383149 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b34c122-9e66-4876-ad10-d90898363166-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1b34c122-9e66-4876-ad10-d90898363166" (UID: "1b34c122-9e66-4876-ad10-d90898363166"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:51:33.383308 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.383186 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b34c122-9e66-4876-ad10-d90898363166-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1b34c122-9e66-4876-ad10-d90898363166" (UID: "1b34c122-9e66-4876-ad10-d90898363166"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:51:33.383745 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.383720 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-kube-api-access-dw4z7" (OuterVolumeSpecName: "kube-api-access-dw4z7") pod "1b34c122-9e66-4876-ad10-d90898363166" (UID: "1b34c122-9e66-4876-ad10-d90898363166"). InnerVolumeSpecName "kube-api-access-dw4z7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:51:33.383842 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.383766 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1b34c122-9e66-4876-ad10-d90898363166" (UID: "1b34c122-9e66-4876-ad10-d90898363166"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:51:33.383842 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.383794 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1b34c122-9e66-4876-ad10-d90898363166" (UID: "1b34c122-9e66-4876-ad10-d90898363166"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:51:33.389549 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.389528 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b34c122-9e66-4876-ad10-d90898363166-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1b34c122-9e66-4876-ad10-d90898363166" (UID: "1b34c122-9e66-4876-ad10-d90898363166"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:51:33.479317 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.479294 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-registry-tls\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:51:33.479317 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.479316 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-bound-sa-token\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:51:33.479428 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.479325 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b34c122-9e66-4876-ad10-d90898363166-ca-trust-extracted\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:51:33.479428 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.479336 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b34c122-9e66-4876-ad10-d90898363166-image-registry-private-configuration\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:51:33.479428 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.479345 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b34c122-9e66-4876-ad10-d90898363166-installation-pull-secrets\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:51:33.479428 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.479354 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dw4z7\" (UniqueName: \"kubernetes.io/projected/1b34c122-9e66-4876-ad10-d90898363166-kube-api-access-dw4z7\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath 
\"\"" Apr 17 18:51:33.701104 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.701022 2571 generic.go:358] "Generic (PLEG): container finished" podID="1b34c122-9e66-4876-ad10-d90898363166" containerID="8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca" exitCode=0 Apr 17 18:51:33.701104 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.701097 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" Apr 17 18:51:33.701324 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.701112 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" event={"ID":"1b34c122-9e66-4876-ad10-d90898363166","Type":"ContainerDied","Data":"8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca"} Apr 17 18:51:33.701324 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.701159 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bb99cc44b-rr54f" event={"ID":"1b34c122-9e66-4876-ad10-d90898363166","Type":"ContainerDied","Data":"b98f65ecfb6f034217e3248f3e93a313ec94323e1e80498da55222d61fab2896"} Apr 17 18:51:33.701324 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.701190 2571 scope.go:117] "RemoveContainer" containerID="8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca" Apr 17 18:51:33.709205 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.709189 2571 scope.go:117] "RemoveContainer" containerID="8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca" Apr 17 18:51:33.709510 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:51:33.709484 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca\": container with ID starting with 8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca not found: ID does not 
exist" containerID="8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca" Apr 17 18:51:33.709600 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.709517 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca"} err="failed to get container status \"8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca\": rpc error: code = NotFound desc = could not find container \"8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca\": container with ID starting with 8b5eb98742a21996f030af193d6b6c449b6959e6ec1343f5369491e2d8fc9bca not found: ID does not exist" Apr 17 18:51:33.722377 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.722354 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bb99cc44b-rr54f"] Apr 17 18:51:33.725812 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:33.725793 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7bb99cc44b-rr54f"] Apr 17 18:51:35.236784 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:35.236747 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b34c122-9e66-4876-ad10-d90898363166" path="/var/lib/kubelet/pods/1b34c122-9e66-4876-ad10-d90898363166/volumes" Apr 17 18:51:38.885053 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.885019 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c5646dc57-f5zjs"] Apr 17 18:51:38.885546 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.885405 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b34c122-9e66-4876-ad10-d90898363166" containerName="registry" Apr 17 18:51:38.885546 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.885436 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b34c122-9e66-4876-ad10-d90898363166" containerName="registry" Apr 17 
18:51:38.885546 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.885519 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b34c122-9e66-4876-ad10-d90898363166" containerName="registry" Apr 17 18:51:38.888658 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.888639 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:38.890717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.890691 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 18:51:38.890717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.890709 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gkfmj\"" Apr 17 18:51:38.891413 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.891392 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 18:51:38.891536 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.891428 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 18:51:38.891631 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.891617 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 18:51:38.891704 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.891689 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 18:51:38.891759 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.891716 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 18:51:38.891759 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.891719 2571 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 18:51:38.894853 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:38.894825 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5646dc57-f5zjs"] Apr 17 18:51:39.020635 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.020605 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18bd33cc-af31-4a3f-85be-462b918b953f-console-oauth-config\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.020782 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.020644 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t2t5\" (UniqueName: \"kubernetes.io/projected/18bd33cc-af31-4a3f-85be-462b918b953f-kube-api-access-5t2t5\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.020782 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.020672 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-service-ca\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.020782 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.020729 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18bd33cc-af31-4a3f-85be-462b918b953f-console-serving-cert\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 
17 18:51:39.020782 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.020763 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-console-config\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.020953 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.020822 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-oauth-serving-cert\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.121585 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.121558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-oauth-serving-cert\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.121692 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.121594 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18bd33cc-af31-4a3f-85be-462b918b953f-console-oauth-config\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.121692 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.121615 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5t2t5\" (UniqueName: \"kubernetes.io/projected/18bd33cc-af31-4a3f-85be-462b918b953f-kube-api-access-5t2t5\") pod 
\"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.121692 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.121641 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-service-ca\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.121692 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.121685 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18bd33cc-af31-4a3f-85be-462b918b953f-console-serving-cert\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.121872 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.121709 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-console-config\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.122317 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.122294 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-service-ca\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.122420 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.122303 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-oauth-serving-cert\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.122420 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.122407 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-console-config\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.124078 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.124059 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18bd33cc-af31-4a3f-85be-462b918b953f-console-serving-cert\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.124184 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.124164 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18bd33cc-af31-4a3f-85be-462b918b953f-console-oauth-config\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.129346 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.129323 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t2t5\" (UniqueName: \"kubernetes.io/projected/18bd33cc-af31-4a3f-85be-462b918b953f-kube-api-access-5t2t5\") pod \"console-c5646dc57-f5zjs\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") " pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.198602 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.198546 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c5646dc57-f5zjs" Apr 17 18:51:39.313642 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.313501 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5646dc57-f5zjs"] Apr 17 18:51:39.316193 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:51:39.316167 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18bd33cc_af31_4a3f_85be_462b918b953f.slice/crio-0713ffdb803b023f21cbe5dcc031184ba72d1172a73a5e23e0363aeed8c2c254 WatchSource:0}: Error finding container 0713ffdb803b023f21cbe5dcc031184ba72d1172a73a5e23e0363aeed8c2c254: Status 404 returned error can't find the container with id 0713ffdb803b023f21cbe5dcc031184ba72d1172a73a5e23e0363aeed8c2c254 Apr 17 18:51:39.719037 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:39.719005 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5646dc57-f5zjs" event={"ID":"18bd33cc-af31-4a3f-85be-462b918b953f","Type":"ContainerStarted","Data":"0713ffdb803b023f21cbe5dcc031184ba72d1172a73a5e23e0363aeed8c2c254"} Apr 17 18:51:42.729899 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:42.729859 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5646dc57-f5zjs" event={"ID":"18bd33cc-af31-4a3f-85be-462b918b953f","Type":"ContainerStarted","Data":"19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3"} Apr 17 18:51:42.766416 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:42.766367 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c5646dc57-f5zjs" podStartSLOduration=1.9344846489999998 podStartE2EDuration="4.766350429s" podCreationTimestamp="2026-04-17 18:51:38 +0000 UTC" firstStartedPulling="2026-04-17 18:51:39.318121322 +0000 UTC m=+138.592138593" lastFinishedPulling="2026-04-17 18:51:42.149987103 +0000 UTC m=+141.424004373" 
observedRunningTime="2026-04-17 18:51:42.765650204 +0000 UTC m=+142.039667494" watchObservedRunningTime="2026-04-17 18:51:42.766350429 +0000 UTC m=+142.040367724" Apr 17 18:51:48.479023 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.478990 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74d689b959-wdc6l"] Apr 17 18:51:48.481537 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.481521 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.486940 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.486920 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-service-ca\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.487070 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.486966 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-serving-cert\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.487070 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.487049 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-config\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.487189 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.487133 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-fwrt6\" (UniqueName: \"kubernetes.io/projected/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-kube-api-access-fwrt6\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.487189 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.487165 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-oauth-config\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.487285 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.487191 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-oauth-serving-cert\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.487285 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.487226 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-trusted-ca-bundle\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.488588 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.488564 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 18:51:48.491059 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.491038 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74d689b959-wdc6l"] Apr 17 18:51:48.587884 
ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.587853 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-service-ca\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.588009 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.587890 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-serving-cert\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.588009 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.587906 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-config\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.588009 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.587937 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwrt6\" (UniqueName: \"kubernetes.io/projected/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-kube-api-access-fwrt6\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.588153 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.588044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-oauth-config\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " 
pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.588153 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.588076 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-oauth-serving-cert\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.588153 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.588110 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-trusted-ca-bundle\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.588790 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.588741 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-oauth-serving-cert\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.588790 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.588780 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-service-ca\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.588921 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.588780 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-config\") pod \"console-74d689b959-wdc6l\" (UID: 
\"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.589387 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.589366 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-trusted-ca-bundle\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.590438 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.590410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-oauth-config\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.590601 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.590585 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-serving-cert\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.594702 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.594683 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwrt6\" (UniqueName: \"kubernetes.io/projected/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-kube-api-access-fwrt6\") pod \"console-74d689b959-wdc6l\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:51:48.790933 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.790904 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74d689b959-wdc6l"
Apr 17 18:51:48.905081 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:48.905054 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74d689b959-wdc6l"]
Apr 17 18:51:48.907958 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:51:48.907928 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod035c0fd9_ae50_4ca0_b94e_c7ab2946d3c3.slice/crio-1bbc2e788543df7f9d7104b0a1625e50e30370b0f23f751c7a6ddac15c399509 WatchSource:0}: Error finding container 1bbc2e788543df7f9d7104b0a1625e50e30370b0f23f751c7a6ddac15c399509: Status 404 returned error can't find the container with id 1bbc2e788543df7f9d7104b0a1625e50e30370b0f23f751c7a6ddac15c399509
Apr 17 18:51:49.199750 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:49.199659 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c5646dc57-f5zjs"
Apr 17 18:51:49.199750 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:49.199706 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c5646dc57-f5zjs"
Apr 17 18:51:49.204432 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:49.204411 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c5646dc57-f5zjs"
Apr 17 18:51:49.749904 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:49.749865 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d689b959-wdc6l" event={"ID":"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3","Type":"ContainerStarted","Data":"97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441"}
Apr 17 18:51:49.749904 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:49.749905 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d689b959-wdc6l" event={"ID":"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3","Type":"ContainerStarted","Data":"1bbc2e788543df7f9d7104b0a1625e50e30370b0f23f751c7a6ddac15c399509"}
Apr 17 18:51:49.753792 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:49.753764 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c5646dc57-f5zjs"
Apr 17 18:51:49.764903 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:49.764854 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74d689b959-wdc6l" podStartSLOduration=1.7648370089999998 podStartE2EDuration="1.764837009s" podCreationTimestamp="2026-04-17 18:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:51:49.764292485 +0000 UTC m=+149.038309780" watchObservedRunningTime="2026-04-17 18:51:49.764837009 +0000 UTC m=+149.038854301"
Apr 17 18:51:58.791205 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:58.791173 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74d689b959-wdc6l"
Apr 17 18:51:58.791591 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:58.791215 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74d689b959-wdc6l"
Apr 17 18:51:58.795894 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:58.795874 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74d689b959-wdc6l"
Apr 17 18:51:59.780974 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:59.780944 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74d689b959-wdc6l"
Apr 17 18:51:59.823036 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:51:59.823003 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c5646dc57-f5zjs"]
Apr 17 18:52:00.780726 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:00.780698 2571 generic.go:358] "Generic (PLEG): container finished" podID="c5fd97d3-b332-4c85-9344-c0e9f314aed6" containerID="fcbe228a0b910472d5955143d790361f7c64018ed8249004501032f57b6ba5c9" exitCode=0
Apr 17 18:52:00.780828 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:00.780771 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-88g9b" event={"ID":"c5fd97d3-b332-4c85-9344-c0e9f314aed6","Type":"ContainerDied","Data":"fcbe228a0b910472d5955143d790361f7c64018ed8249004501032f57b6ba5c9"}
Apr 17 18:52:00.781161 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:00.781147 2571 scope.go:117] "RemoveContainer" containerID="fcbe228a0b910472d5955143d790361f7c64018ed8249004501032f57b6ba5c9"
Apr 17 18:52:01.787078 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:01.787036 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-88g9b" event={"ID":"c5fd97d3-b332-4c85-9344-c0e9f314aed6","Type":"ContainerStarted","Data":"fa5afaa923f931ca74c077e474f3174445cff6c6c957f4cdd1ed6ac8d21e8b10"}
Apr 17 18:52:24.841638 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:24.841596 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c5646dc57-f5zjs" podUID="18bd33cc-af31-4a3f-85be-462b918b953f" containerName="console" containerID="cri-o://19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3" gracePeriod=15
Apr 17 18:52:25.078714 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.078694 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c5646dc57-f5zjs_18bd33cc-af31-4a3f-85be-462b918b953f/console/0.log"
Apr 17 18:52:25.078810 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.078757 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5646dc57-f5zjs"
Apr 17 18:52:25.159212 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.159145 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18bd33cc-af31-4a3f-85be-462b918b953f-console-oauth-config\") pod \"18bd33cc-af31-4a3f-85be-462b918b953f\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") "
Apr 17 18:52:25.159212 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.159187 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-oauth-serving-cert\") pod \"18bd33cc-af31-4a3f-85be-462b918b953f\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") "
Apr 17 18:52:25.159212 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.159210 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18bd33cc-af31-4a3f-85be-462b918b953f-console-serving-cert\") pod \"18bd33cc-af31-4a3f-85be-462b918b953f\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") "
Apr 17 18:52:25.159433 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.159230 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t2t5\" (UniqueName: \"kubernetes.io/projected/18bd33cc-af31-4a3f-85be-462b918b953f-kube-api-access-5t2t5\") pod \"18bd33cc-af31-4a3f-85be-462b918b953f\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") "
Apr 17 18:52:25.159433 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.159265 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-service-ca\") pod \"18bd33cc-af31-4a3f-85be-462b918b953f\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") "
Apr 17 18:52:25.159433 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.159291 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-console-config\") pod \"18bd33cc-af31-4a3f-85be-462b918b953f\" (UID: \"18bd33cc-af31-4a3f-85be-462b918b953f\") "
Apr 17 18:52:25.159728 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.159685 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "18bd33cc-af31-4a3f-85be-462b918b953f" (UID: "18bd33cc-af31-4a3f-85be-462b918b953f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:52:25.159728 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.159717 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-console-config" (OuterVolumeSpecName: "console-config") pod "18bd33cc-af31-4a3f-85be-462b918b953f" (UID: "18bd33cc-af31-4a3f-85be-462b918b953f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:52:25.159875 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.159702 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-service-ca" (OuterVolumeSpecName: "service-ca") pod "18bd33cc-af31-4a3f-85be-462b918b953f" (UID: "18bd33cc-af31-4a3f-85be-462b918b953f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:52:25.161487 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.161439 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18bd33cc-af31-4a3f-85be-462b918b953f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "18bd33cc-af31-4a3f-85be-462b918b953f" (UID: "18bd33cc-af31-4a3f-85be-462b918b953f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:52:25.161576 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.161520 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18bd33cc-af31-4a3f-85be-462b918b953f-kube-api-access-5t2t5" (OuterVolumeSpecName: "kube-api-access-5t2t5") pod "18bd33cc-af31-4a3f-85be-462b918b953f" (UID: "18bd33cc-af31-4a3f-85be-462b918b953f"). InnerVolumeSpecName "kube-api-access-5t2t5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 18:52:25.161576 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.161524 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18bd33cc-af31-4a3f-85be-462b918b953f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "18bd33cc-af31-4a3f-85be-462b918b953f" (UID: "18bd33cc-af31-4a3f-85be-462b918b953f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:52:25.260566 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.260543 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-oauth-serving-cert\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:25.260652 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.260568 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18bd33cc-af31-4a3f-85be-462b918b953f-console-serving-cert\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:25.260652 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.260579 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5t2t5\" (UniqueName: \"kubernetes.io/projected/18bd33cc-af31-4a3f-85be-462b918b953f-kube-api-access-5t2t5\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:25.260652 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.260588 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-service-ca\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:25.260652 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.260597 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18bd33cc-af31-4a3f-85be-462b918b953f-console-config\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:25.260652 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.260606 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18bd33cc-af31-4a3f-85be-462b918b953f-console-oauth-config\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:25.852950 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.852920 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c5646dc57-f5zjs_18bd33cc-af31-4a3f-85be-462b918b953f/console/0.log"
Apr 17 18:52:25.853322 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.852965 2571 generic.go:358] "Generic (PLEG): container finished" podID="18bd33cc-af31-4a3f-85be-462b918b953f" containerID="19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3" exitCode=2
Apr 17 18:52:25.853322 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.853049 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5646dc57-f5zjs"
Apr 17 18:52:25.853322 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.853055 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5646dc57-f5zjs" event={"ID":"18bd33cc-af31-4a3f-85be-462b918b953f","Type":"ContainerDied","Data":"19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3"}
Apr 17 18:52:25.853322 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.853095 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5646dc57-f5zjs" event={"ID":"18bd33cc-af31-4a3f-85be-462b918b953f","Type":"ContainerDied","Data":"0713ffdb803b023f21cbe5dcc031184ba72d1172a73a5e23e0363aeed8c2c254"}
Apr 17 18:52:25.853322 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.853112 2571 scope.go:117] "RemoveContainer" containerID="19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3"
Apr 17 18:52:25.860510 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.860493 2571 scope.go:117] "RemoveContainer" containerID="19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3"
Apr 17 18:52:25.860760 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:52:25.860742 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3\": container with ID starting with 19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3 not found: ID does not exist" containerID="19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3"
Apr 17 18:52:25.860819 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.860767 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3"} err="failed to get container status \"19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3\": rpc error: code = NotFound desc = could not find container \"19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3\": container with ID starting with 19ee5ac58a0c756e30795850a8dd7ffa2cab9522eabd1617ab481e28f75dfbb3 not found: ID does not exist"
Apr 17 18:52:25.867341 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.867312 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c5646dc57-f5zjs"]
Apr 17 18:52:25.870875 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:25.870853 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c5646dc57-f5zjs"]
Apr 17 18:52:27.237165 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:27.237132 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18bd33cc-af31-4a3f-85be-462b918b953f" path="/var/lib/kubelet/pods/18bd33cc-af31-4a3f-85be-462b918b953f/volumes"
Apr 17 18:52:40.198944 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.198915 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 18:52:40.199576 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.199514 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="alertmanager" containerID="cri-o://f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676" gracePeriod=120
Apr 17 18:52:40.199769 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.199564 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="kube-rbac-proxy-metric" containerID="cri-o://cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5" gracePeriod=120
Apr 17 18:52:40.199769 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.199622 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="prom-label-proxy" containerID="cri-o://cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5" gracePeriod=120
Apr 17 18:52:40.199769 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.199573 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="kube-rbac-proxy-web" containerID="cri-o://8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168" gracePeriod=120
Apr 17 18:52:40.199769 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.199596 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="config-reloader" containerID="cri-o://54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32" gracePeriod=120
Apr 17 18:52:40.199769 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.199614 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="kube-rbac-proxy" containerID="cri-o://167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556" gracePeriod=120
Apr 17 18:52:40.895573 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.895542 2571 generic.go:358] "Generic (PLEG): container finished" podID="742a2272-576d-4cfe-87a1-e174cb5be061" containerID="cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5" exitCode=0
Apr 17 18:52:40.895573 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.895567 2571 generic.go:358] "Generic (PLEG): container finished" podID="742a2272-576d-4cfe-87a1-e174cb5be061" containerID="167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556" exitCode=0
Apr 17 18:52:40.895573 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.895576 2571 generic.go:358] "Generic (PLEG): container finished" podID="742a2272-576d-4cfe-87a1-e174cb5be061" containerID="54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32" exitCode=0
Apr 17 18:52:40.895573 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.895582 2571 generic.go:358] "Generic (PLEG): container finished" podID="742a2272-576d-4cfe-87a1-e174cb5be061" containerID="f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676" exitCode=0
Apr 17 18:52:40.895832 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.895611 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerDied","Data":"cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5"}
Apr 17 18:52:40.895832 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.895643 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerDied","Data":"167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556"}
Apr 17 18:52:40.895832 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.895653 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerDied","Data":"54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32"}
Apr 17 18:52:40.895832 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:40.895663 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerDied","Data":"f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676"}
Apr 17 18:52:41.435974 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.435952 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:41.569623 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569544 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/742a2272-576d-4cfe-87a1-e174cb5be061-config-out\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.569623 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569587 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/742a2272-576d-4cfe-87a1-e174cb5be061-metrics-client-ca\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.569623 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569615 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy-web\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.569931 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569638 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-web-config\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.569931 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569655 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-config-volume\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.569931 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569669 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-cluster-tls-config\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.569931 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569691 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.569931 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569720 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/742a2272-576d-4cfe-87a1-e174cb5be061-alertmanager-main-db\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.569931 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569751 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/742a2272-576d-4cfe-87a1-e174cb5be061-tls-assets\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.569931 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569776 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742a2272-576d-4cfe-87a1-e174cb5be061-alertmanager-trusted-ca-bundle\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.569931 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569818 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-main-tls\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.569931 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569844 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shp2l\" (UniqueName: \"kubernetes.io/projected/742a2272-576d-4cfe-87a1-e174cb5be061-kube-api-access-shp2l\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.569931 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569885 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy-metric\") pod \"742a2272-576d-4cfe-87a1-e174cb5be061\" (UID: \"742a2272-576d-4cfe-87a1-e174cb5be061\") "
Apr 17 18:52:41.570433 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.569955 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/742a2272-576d-4cfe-87a1-e174cb5be061-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:52:41.570433 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.570138 2571 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/742a2272-576d-4cfe-87a1-e174cb5be061-metrics-client-ca\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.572291 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.572258 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:52:41.572627 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.572598 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:52:41.573317 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.572873 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-config-volume" (OuterVolumeSpecName: "config-volume") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:52:41.573317 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.572905 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/742a2272-576d-4cfe-87a1-e174cb5be061-config-out" (OuterVolumeSpecName: "config-out") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 18:52:41.573317 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.572926 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/742a2272-576d-4cfe-87a1-e174cb5be061-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 18:52:41.573317 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.573181 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/742a2272-576d-4cfe-87a1-e174cb5be061-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 18:52:41.573317 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.573287 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/742a2272-576d-4cfe-87a1-e174cb5be061-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:52:41.573671 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.573545 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:52:41.574367 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.574337 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/742a2272-576d-4cfe-87a1-e174cb5be061-kube-api-access-shp2l" (OuterVolumeSpecName: "kube-api-access-shp2l") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "kube-api-access-shp2l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 18:52:41.574514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.574493 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:52:41.576391 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.576364 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:52:41.581857 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.581833 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-web-config" (OuterVolumeSpecName: "web-config") pod "742a2272-576d-4cfe-87a1-e174cb5be061" (UID: "742a2272-576d-4cfe-87a1-e174cb5be061"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:52:41.670945 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.670919 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.670945 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.670941 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-web-config\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.671052 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.670952 2571 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-config-volume\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.671052 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.670961 2571 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-cluster-tls-config\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.671052 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.670970 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.671052 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.670979 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/742a2272-576d-4cfe-87a1-e174cb5be061-alertmanager-main-db\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.671052 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.670989 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/742a2272-576d-4cfe-87a1-e174cb5be061-tls-assets\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.671052 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.670997 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742a2272-576d-4cfe-87a1-e174cb5be061-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.671052 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.671006 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-main-tls\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.671052 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.671015 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-shp2l\" (UniqueName: \"kubernetes.io/projected/742a2272-576d-4cfe-87a1-e174cb5be061-kube-api-access-shp2l\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.671052 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.671025 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/742a2272-576d-4cfe-87a1-e174cb5be061-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.671052 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.671034 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/742a2272-576d-4cfe-87a1-e174cb5be061-config-out\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:52:41.900732 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.900651 2571 generic.go:358] "Generic (PLEG): container finished" podID="742a2272-576d-4cfe-87a1-e174cb5be061" containerID="cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5" exitCode=0
Apr 17 18:52:41.900732 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.900678 2571 generic.go:358] "Generic (PLEG): container finished" podID="742a2272-576d-4cfe-87a1-e174cb5be061" containerID="8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168" exitCode=0
Apr 17 18:52:41.900732 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.900716 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerDied","Data":"cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5"}
Apr 17 18:52:41.900936 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.900746 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerDied","Data":"8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168"}
Apr 17 18:52:41.900936 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.900755 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:41.900936 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.900767 2571 scope.go:117] "RemoveContainer" containerID="cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5"
Apr 17 18:52:41.900936 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.900758 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742a2272-576d-4cfe-87a1-e174cb5be061","Type":"ContainerDied","Data":"e6749f4fac3256e5c4ee844bb2a712973ed8c6484aba993d5b4434286142cb34"}
Apr 17 18:52:41.908540 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.908396 2571 scope.go:117] "RemoveContainer" containerID="cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5"
Apr 17 18:52:41.915050 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.915034 2571 scope.go:117] "RemoveContainer" containerID="167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556"
Apr 17 18:52:41.921045 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.921027 2571 scope.go:117] "RemoveContainer" containerID="8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168"
Apr 17 18:52:41.923291 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.923255 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 18:52:41.927014 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.926995 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 18:52:41.928050 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.928035 2571
scope.go:117] "RemoveContainer" containerID="54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32" Apr 17 18:52:41.934163 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.934147 2571 scope.go:117] "RemoveContainer" containerID="f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676" Apr 17 18:52:41.940166 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.940149 2571 scope.go:117] "RemoveContainer" containerID="b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0" Apr 17 18:52:41.946126 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.946107 2571 scope.go:117] "RemoveContainer" containerID="cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5" Apr 17 18:52:41.946370 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:52:41.946351 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5\": container with ID starting with cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5 not found: ID does not exist" containerID="cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5" Apr 17 18:52:41.946418 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.946379 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5"} err="failed to get container status \"cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5\": rpc error: code = NotFound desc = could not find container \"cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5\": container with ID starting with cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5 not found: ID does not exist" Apr 17 18:52:41.946418 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.946403 2571 scope.go:117] "RemoveContainer" containerID="cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5" 
Apr 17 18:52:41.946657 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:52:41.946641 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5\": container with ID starting with cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5 not found: ID does not exist" containerID="cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5" Apr 17 18:52:41.946706 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.946661 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5"} err="failed to get container status \"cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5\": rpc error: code = NotFound desc = could not find container \"cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5\": container with ID starting with cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5 not found: ID does not exist" Apr 17 18:52:41.946706 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.946675 2571 scope.go:117] "RemoveContainer" containerID="167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556" Apr 17 18:52:41.946919 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:52:41.946902 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556\": container with ID starting with 167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556 not found: ID does not exist" containerID="167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556" Apr 17 18:52:41.946960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.946934 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556"} err="failed to get container status \"167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556\": rpc error: code = NotFound desc = could not find container \"167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556\": container with ID starting with 167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556 not found: ID does not exist" Apr 17 18:52:41.946960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.946949 2571 scope.go:117] "RemoveContainer" containerID="8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168" Apr 17 18:52:41.947184 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:52:41.947164 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168\": container with ID starting with 8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168 not found: ID does not exist" containerID="8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168" Apr 17 18:52:41.947235 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.947189 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168"} err="failed to get container status \"8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168\": rpc error: code = NotFound desc = could not find container \"8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168\": container with ID starting with 8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168 not found: ID does not exist" Apr 17 18:52:41.947235 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.947204 2571 scope.go:117] "RemoveContainer" containerID="54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32" Apr 17 18:52:41.947431 ip-10-0-132-192 
kubenswrapper[2571]: E0417 18:52:41.947415 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32\": container with ID starting with 54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32 not found: ID does not exist" containerID="54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32" Apr 17 18:52:41.947524 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.947436 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32"} err="failed to get container status \"54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32\": rpc error: code = NotFound desc = could not find container \"54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32\": container with ID starting with 54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32 not found: ID does not exist" Apr 17 18:52:41.947524 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.947450 2571 scope.go:117] "RemoveContainer" containerID="f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676" Apr 17 18:52:41.947694 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:52:41.947679 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676\": container with ID starting with f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676 not found: ID does not exist" containerID="f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676" Apr 17 18:52:41.947730 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.947696 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676"} 
err="failed to get container status \"f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676\": rpc error: code = NotFound desc = could not find container \"f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676\": container with ID starting with f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676 not found: ID does not exist" Apr 17 18:52:41.947730 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.947717 2571 scope.go:117] "RemoveContainer" containerID="b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0" Apr 17 18:52:41.947928 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:52:41.947913 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0\": container with ID starting with b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0 not found: ID does not exist" containerID="b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0" Apr 17 18:52:41.947966 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.947931 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0"} err="failed to get container status \"b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0\": rpc error: code = NotFound desc = could not find container \"b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0\": container with ID starting with b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0 not found: ID does not exist" Apr 17 18:52:41.947966 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.947945 2571 scope.go:117] "RemoveContainer" containerID="cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5" Apr 17 18:52:41.948163 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.948145 2571 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5"} err="failed to get container status \"cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5\": rpc error: code = NotFound desc = could not find container \"cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5\": container with ID starting with cd8d199e58925b1647cb5063ba4c9ec2a59b5ad2dae5408ce5497d8e0d76a1f5 not found: ID does not exist" Apr 17 18:52:41.948223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.948164 2571 scope.go:117] "RemoveContainer" containerID="cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5" Apr 17 18:52:41.948381 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.948359 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5"} err="failed to get container status \"cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5\": rpc error: code = NotFound desc = could not find container \"cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5\": container with ID starting with cf0a15749bc3428312687f2fb086a62fe72bdf82b56b1167255e7d74b74a0ca5 not found: ID does not exist" Apr 17 18:52:41.948381 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.948381 2571 scope.go:117] "RemoveContainer" containerID="167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556" Apr 17 18:52:41.948609 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.948593 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556"} err="failed to get container status \"167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556\": rpc error: code = NotFound desc = could not find container \"167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556\": container with ID starting with 
167eb6169d238e429885b20462c03f0ccb269d06f28885296236053992087556 not found: ID does not exist" Apr 17 18:52:41.948650 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.948610 2571 scope.go:117] "RemoveContainer" containerID="8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168" Apr 17 18:52:41.948820 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.948803 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168"} err="failed to get container status \"8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168\": rpc error: code = NotFound desc = could not find container \"8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168\": container with ID starting with 8a1cb5335182ba7dcebb09cc75b027d9630c78179239680f75dcdd536180a168 not found: ID does not exist" Apr 17 18:52:41.948820 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.948820 2571 scope.go:117] "RemoveContainer" containerID="54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32" Apr 17 18:52:41.949014 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.948993 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32"} err="failed to get container status \"54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32\": rpc error: code = NotFound desc = could not find container \"54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32\": container with ID starting with 54d92f2c7e520196a1361e5e4df5fc084030f060162526915ebec5cc1b6daa32 not found: ID does not exist" Apr 17 18:52:41.949055 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.949015 2571 scope.go:117] "RemoveContainer" containerID="f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676" Apr 17 18:52:41.949220 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.949206 2571 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676"} err="failed to get container status \"f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676\": rpc error: code = NotFound desc = could not find container \"f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676\": container with ID starting with f0cee600e4140634e04f660c8b99f0c244aff7cf2696b3abcd8d2c9adc7f9676 not found: ID does not exist" Apr 17 18:52:41.949273 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.949220 2571 scope.go:117] "RemoveContainer" containerID="b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0" Apr 17 18:52:41.949432 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.949417 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0"} err="failed to get container status \"b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0\": rpc error: code = NotFound desc = could not find container \"b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0\": container with ID starting with b5d8591ed6b887d543fe4142aca69cbbf8a3979dcaffbe49b0df8db874c3abe0 not found: ID does not exist" Apr 17 18:52:41.952952 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.952932 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 18:52:41.953172 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953159 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="kube-rbac-proxy" Apr 17 18:52:41.953224 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953174 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="kube-rbac-proxy" Apr 17 18:52:41.953224 
ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953182 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="kube-rbac-proxy-metric" Apr 17 18:52:41.953224 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953188 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="kube-rbac-proxy-metric" Apr 17 18:52:41.953224 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953194 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="prom-label-proxy" Apr 17 18:52:41.953224 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953200 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="prom-label-proxy" Apr 17 18:52:41.953224 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953207 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18bd33cc-af31-4a3f-85be-462b918b953f" containerName="console" Apr 17 18:52:41.953224 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953212 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bd33cc-af31-4a3f-85be-462b918b953f" containerName="console" Apr 17 18:52:41.953224 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953220 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="init-config-reloader" Apr 17 18:52:41.953224 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953226 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="init-config-reloader" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953237 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" 
containerName="config-reloader" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953257 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="config-reloader" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953264 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="alertmanager" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953269 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="alertmanager" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953283 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="kube-rbac-proxy-web" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953289 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="kube-rbac-proxy-web" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953328 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="alertmanager" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953335 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="prom-label-proxy" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953342 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="18bd33cc-af31-4a3f-85be-462b918b953f" containerName="console" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953348 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" 
containerName="kube-rbac-proxy" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953353 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="kube-rbac-proxy-metric" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953360 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="config-reloader" Apr 17 18:52:41.953514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.953365 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" containerName="kube-rbac-proxy-web" Apr 17 18:52:41.958194 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.958179 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:52:41.960396 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.960376 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 18:52:41.960507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.960399 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 18:52:41.960507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.960441 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 18:52:41.960785 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.960765 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 18:52:41.960850 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.960801 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-57tht\"" Apr 17 18:52:41.960850 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.960772 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 18:52:41.960850 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.960769 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 18:52:41.961087 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.961066 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 18:52:41.961168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.961122 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 18:52:41.965854 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.965838 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 18:52:41.968794 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:41.968774 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 18:52:42.074754 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.074723 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:52:42.074754 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.074758 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:52:42.074951 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.074780 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c68c979e-cae5-44ab-8530-6033686ab885-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:52:42.074951 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.074832 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g25vr\" (UniqueName: \"kubernetes.io/projected/c68c979e-cae5-44ab-8530-6033686ab885-kube-api-access-g25vr\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:52:42.074951 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.074905 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-web-config\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 18:52:42.074951 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.074939 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
18:52:42.075079 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.074969 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c68c979e-cae5-44ab-8530-6033686ab885-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.075079 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.074986 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68c979e-cae5-44ab-8530-6033686ab885-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.075079 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.075008 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c68c979e-cae5-44ab-8530-6033686ab885-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.075079 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.075029 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.075079 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.075054 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.075079 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.075073 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-config-volume\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.075258 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.075092 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c68c979e-cae5-44ab-8530-6033686ab885-config-out\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176337 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176270 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68c979e-cae5-44ab-8530-6033686ab885-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176337 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c68c979e-cae5-44ab-8530-6033686ab885-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176337 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176324 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176578 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176344 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176578 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176364 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-config-volume\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176578 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c68c979e-cae5-44ab-8530-6033686ab885-config-out\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176578 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176414 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176578 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176493 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176578 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176525 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c68c979e-cae5-44ab-8530-6033686ab885-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176585 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g25vr\" (UniqueName: \"kubernetes.io/projected/c68c979e-cae5-44ab-8530-6033686ab885-kube-api-access-g25vr\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176622 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-web-config\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.176840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.176676 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c68c979e-cae5-44ab-8530-6033686ab885-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.177316 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.177289 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68c979e-cae5-44ab-8530-6033686ab885-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.178060 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.177770 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c68c979e-cae5-44ab-8530-6033686ab885-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.179498 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.179399 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c68c979e-cae5-44ab-8530-6033686ab885-config-out\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.179498 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.179409 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.179498 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.177299 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c68c979e-cae5-44ab-8530-6033686ab885-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.179498 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.179449 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c68c979e-cae5-44ab-8530-6033686ab885-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.179728 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.179503 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-config-volume\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.179728 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.179596 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.179891 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.179869 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.180267 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.180249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.180553 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.180535 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.181350 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.181335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c68c979e-cae5-44ab-8530-6033686ab885-web-config\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.186601 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.186586 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g25vr\" (UniqueName: \"kubernetes.io/projected/c68c979e-cae5-44ab-8530-6033686ab885-kube-api-access-g25vr\") pod \"alertmanager-main-0\" (UID: \"c68c979e-cae5-44ab-8530-6033686ab885\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.267201 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.267178 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 18:52:42.382218 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.382180 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 18:52:42.386728 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:52:42.386702 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68c979e_cae5_44ab_8530_6033686ab885.slice/crio-ab29c889e7e6736765af73751402eb3a062b1b3095268bfa2b6347aa3cb448d8 WatchSource:0}: Error finding container ab29c889e7e6736765af73751402eb3a062b1b3095268bfa2b6347aa3cb448d8: Status 404 returned error can't find the container with id ab29c889e7e6736765af73751402eb3a062b1b3095268bfa2b6347aa3cb448d8
Apr 17 18:52:42.905694 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.905658 2571 generic.go:358] "Generic (PLEG): container finished" podID="c68c979e-cae5-44ab-8530-6033686ab885" containerID="06dacaad9a5fadbe4604ab5f8e24b639db56244d139baf8568f767e4ae1d2326" exitCode=0
Apr 17 18:52:42.906041 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.905719 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68c979e-cae5-44ab-8530-6033686ab885","Type":"ContainerDied","Data":"06dacaad9a5fadbe4604ab5f8e24b639db56244d139baf8568f767e4ae1d2326"}
Apr 17 18:52:42.906041 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:42.905745 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68c979e-cae5-44ab-8530-6033686ab885","Type":"ContainerStarted","Data":"ab29c889e7e6736765af73751402eb3a062b1b3095268bfa2b6347aa3cb448d8"}
Apr 17 18:52:43.239241 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:43.239209 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="742a2272-576d-4cfe-87a1-e174cb5be061" path="/var/lib/kubelet/pods/742a2272-576d-4cfe-87a1-e174cb5be061/volumes"
Apr 17 18:52:43.911203 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:43.911170 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68c979e-cae5-44ab-8530-6033686ab885","Type":"ContainerStarted","Data":"bb4eb6b481bddf60dfd6dd8541efa4df9d367c3c777f79d9b17846a99218a70b"}
Apr 17 18:52:43.911203 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:43.911205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68c979e-cae5-44ab-8530-6033686ab885","Type":"ContainerStarted","Data":"a35d91333793a29dd5cbd7a0455c4f8cdae4bce43d9974aeb61342603734514c"}
Apr 17 18:52:43.911619 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:43.911215 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68c979e-cae5-44ab-8530-6033686ab885","Type":"ContainerStarted","Data":"29fda0c9c7aae4ed8d611c8a0977823e022d4372738d00ee60132ebff9e55314"}
Apr 17 18:52:43.911619 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:43.911223 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68c979e-cae5-44ab-8530-6033686ab885","Type":"ContainerStarted","Data":"a811aa2c5fdbf3c9794aa501b8e5d15d642515dab5ddefb50941feea14a47c1d"}
Apr 17 18:52:43.911619 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:43.911230 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68c979e-cae5-44ab-8530-6033686ab885","Type":"ContainerStarted","Data":"2598d306ca30dfd0b268255d3392824be58fb4d483e7bf6fb2a3ab2d7897d431"}
Apr 17 18:52:43.911619 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:43.911238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c68c979e-cae5-44ab-8530-6033686ab885","Type":"ContainerStarted","Data":"f94500b61f53ce7777e2faddb072d928aca7d5ed5fca6d64f68822d6ce87b452"}
Apr 17 18:52:43.937953 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:43.937908 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.937896248 podStartE2EDuration="2.937896248s" podCreationTimestamp="2026-04-17 18:52:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:52:43.937224612 +0000 UTC m=+203.211241904" watchObservedRunningTime="2026-04-17 18:52:43.937896248 +0000 UTC m=+203.211913541"
Apr 17 18:52:44.222262 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.222172 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-759c9978dc-xj9k4"]
Apr 17 18:52:44.225746 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.225721 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.228072 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.228037 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 18:52:44.228195 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.228072 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 18:52:44.228302 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.228277 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-m6wzr\""
Apr 17 18:52:44.228392 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.228339 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 18:52:44.228472 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.228409 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 18:52:44.228550 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.228532 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 18:52:44.232902 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.232878 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 18:52:44.235150 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.235129 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-759c9978dc-xj9k4"]
Apr 17 18:52:44.294570 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.294543 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/cc158f8a-b93a-495f-9067-c18fb50820cd-secret-telemeter-client\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.294726 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.294580 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc158f8a-b93a-495f-9067-c18fb50820cd-metrics-client-ca\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.294726 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.294597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw7v7\" (UniqueName: \"kubernetes.io/projected/cc158f8a-b93a-495f-9067-c18fb50820cd-kube-api-access-hw7v7\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.294726 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.294679 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/cc158f8a-b93a-495f-9067-c18fb50820cd-federate-client-tls\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.294726 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.294721 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/cc158f8a-b93a-495f-9067-c18fb50820cd-telemeter-client-tls\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.294884 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.294744 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc158f8a-b93a-495f-9067-c18fb50820cd-serving-certs-ca-bundle\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.294884 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.294766 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cc158f8a-b93a-495f-9067-c18fb50820cd-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.294884 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.294837 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc158f8a-b93a-495f-9067-c18fb50820cd-telemeter-trusted-ca-bundle\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.395201 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.395166 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/cc158f8a-b93a-495f-9067-c18fb50820cd-telemeter-client-tls\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.395343 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.395216 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc158f8a-b93a-495f-9067-c18fb50820cd-serving-certs-ca-bundle\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.395343 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.395248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cc158f8a-b93a-495f-9067-c18fb50820cd-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.395343 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.395278 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc158f8a-b93a-495f-9067-c18fb50820cd-telemeter-trusted-ca-bundle\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.395343 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.395307 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/cc158f8a-b93a-495f-9067-c18fb50820cd-secret-telemeter-client\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.395343 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.395332 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc158f8a-b93a-495f-9067-c18fb50820cd-metrics-client-ca\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.395649 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.395351 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw7v7\" (UniqueName: \"kubernetes.io/projected/cc158f8a-b93a-495f-9067-c18fb50820cd-kube-api-access-hw7v7\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.395649 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.395379 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/cc158f8a-b93a-495f-9067-c18fb50820cd-federate-client-tls\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.396080 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.396046 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc158f8a-b93a-495f-9067-c18fb50820cd-serving-certs-ca-bundle\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.396330 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.396307 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc158f8a-b93a-495f-9067-c18fb50820cd-metrics-client-ca\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.396330 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.396321 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc158f8a-b93a-495f-9067-c18fb50820cd-telemeter-trusted-ca-bundle\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.397845 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.397819 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cc158f8a-b93a-495f-9067-c18fb50820cd-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.397958 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.397917 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/cc158f8a-b93a-495f-9067-c18fb50820cd-telemeter-client-tls\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.398186 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.398167 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/cc158f8a-b93a-495f-9067-c18fb50820cd-secret-telemeter-client\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.398311 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.398291 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/cc158f8a-b93a-495f-9067-c18fb50820cd-federate-client-tls\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.402947 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.402930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw7v7\" (UniqueName: \"kubernetes.io/projected/cc158f8a-b93a-495f-9067-c18fb50820cd-kube-api-access-hw7v7\") pod \"telemeter-client-759c9978dc-xj9k4\" (UID: \"cc158f8a-b93a-495f-9067-c18fb50820cd\") " pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.536843 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.536810 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4"
Apr 17 18:52:44.654936 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.654907 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-759c9978dc-xj9k4"]
Apr 17 18:52:44.657299 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:52:44.657272 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc158f8a_b93a_495f_9067_c18fb50820cd.slice/crio-6e9c091c0af74a4aa5c4a7eb2eadfe69302efecb46accdbc0644a247cc896363 WatchSource:0}: Error finding container 6e9c091c0af74a4aa5c4a7eb2eadfe69302efecb46accdbc0644a247cc896363: Status 404 returned error can't find the container with id 6e9c091c0af74a4aa5c4a7eb2eadfe69302efecb46accdbc0644a247cc896363
Apr 17 18:52:44.916302 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:44.916211 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4" event={"ID":"cc158f8a-b93a-495f-9067-c18fb50820cd","Type":"ContainerStarted","Data":"6e9c091c0af74a4aa5c4a7eb2eadfe69302efecb46accdbc0644a247cc896363"}
Apr 17 18:52:46.923612 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:46.923577 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4" event={"ID":"cc158f8a-b93a-495f-9067-c18fb50820cd","Type":"ContainerStarted","Data":"ae4dfcfd898c6f9fd80ec845fd4f11d53092d0a48167d6e1007b40bdf13899e4"}
Apr 17 18:52:46.924004 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:46.923617 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4" event={"ID":"cc158f8a-b93a-495f-9067-c18fb50820cd","Type":"ContainerStarted","Data":"c04083341d53accadb4d6bef807f9fdd7144d869e248448b91a25a4d7e4582ac"}
Apr 17 18:52:46.924004 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:46.923632 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4" event={"ID":"cc158f8a-b93a-495f-9067-c18fb50820cd","Type":"ContainerStarted","Data":"2222d1f27c5fd2d60b66c618b8cea26054e8edb976b7d37231a07fda6569a4ab"}
Apr 17 18:52:46.971733 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:46.971681 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-759c9978dc-xj9k4" podStartSLOduration=1.602399063 podStartE2EDuration="2.971664931s" podCreationTimestamp="2026-04-17 18:52:44 +0000 UTC" firstStartedPulling="2026-04-17 18:52:44.659136322 +0000 UTC m=+203.933153593" lastFinishedPulling="2026-04-17 18:52:46.028402189 +0000 UTC m=+205.302419461" observedRunningTime="2026-04-17 18:52:46.97101296 +0000 UTC m=+206.245030253" watchObservedRunningTime="2026-04-17 18:52:46.971664931 +0000 UTC m=+206.245682224"
Apr 17 18:52:47.628633 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.628601 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6569bbf787-ct4jl"]
Apr 17 18:52:47.631685 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.631669 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.639762 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.639742 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6569bbf787-ct4jl"]
Apr 17 18:52:47.725568 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.725542 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-service-ca\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.725682 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.725570 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-config\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.725682 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.725598 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-serving-cert\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.725682 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.725646 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmnlb\" (UniqueName: \"kubernetes.io/projected/8f331dc6-9b74-4f40-839d-5ee8a72e424f-kube-api-access-gmnlb\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.725781 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.725686 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-oauth-serving-cert\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.725781 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.725719 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-oauth-config\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.725781 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.725741 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-trusted-ca-bundle\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.826575 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.826549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-oauth-config\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.826693 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.826579 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-trusted-ca-bundle\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.826693 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.826609 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-service-ca\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.826693 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.826633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-config\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.826693 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.826672 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-serving-cert\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:52:47.826885 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.826702 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmnlb\" (UniqueName: \"kubernetes.io/projected/8f331dc6-9b74-4f40-839d-5ee8a72e424f-kube-api-access-gmnlb\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " 
pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:47.827197 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.827170 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-oauth-serving-cert\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:47.827347 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.827317 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-service-ca\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:47.827475 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.827439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-trusted-ca-bundle\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:47.827562 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.827505 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-config\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:47.827719 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.827702 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-oauth-serving-cert\") pod \"console-6569bbf787-ct4jl\" (UID: 
\"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:47.829075 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.829052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-oauth-config\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:47.829176 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.829159 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-serving-cert\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:47.835924 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.835899 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmnlb\" (UniqueName: \"kubernetes.io/projected/8f331dc6-9b74-4f40-839d-5ee8a72e424f-kube-api-access-gmnlb\") pod \"console-6569bbf787-ct4jl\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") " pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:47.941674 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:47.941620 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:48.062656 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:48.062631 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6569bbf787-ct4jl"] Apr 17 18:52:48.064710 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:52:48.064672 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f331dc6_9b74_4f40_839d_5ee8a72e424f.slice/crio-4c074e1996e56740310951fdc7adf97659f9f9e1d3a8cc597910a3e90917e91e WatchSource:0}: Error finding container 4c074e1996e56740310951fdc7adf97659f9f9e1d3a8cc597910a3e90917e91e: Status 404 returned error can't find the container with id 4c074e1996e56740310951fdc7adf97659f9f9e1d3a8cc597910a3e90917e91e Apr 17 18:52:48.930859 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:48.930823 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6569bbf787-ct4jl" event={"ID":"8f331dc6-9b74-4f40-839d-5ee8a72e424f","Type":"ContainerStarted","Data":"b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e"} Apr 17 18:52:48.930859 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:48.930859 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6569bbf787-ct4jl" event={"ID":"8f331dc6-9b74-4f40-839d-5ee8a72e424f","Type":"ContainerStarted","Data":"4c074e1996e56740310951fdc7adf97659f9f9e1d3a8cc597910a3e90917e91e"} Apr 17 18:52:48.947913 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:48.947862 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6569bbf787-ct4jl" podStartSLOduration=1.94784891 podStartE2EDuration="1.94784891s" podCreationTimestamp="2026-04-17 18:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:52:48.945957477 +0000 UTC m=+208.219974770" 
watchObservedRunningTime="2026-04-17 18:52:48.94784891 +0000 UTC m=+208.221866203" Apr 17 18:52:57.942684 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:57.942511 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:57.943162 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:57.942804 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:57.951343 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:57.951319 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:57.961077 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:57.961049 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6569bbf787-ct4jl" Apr 17 18:52:58.007775 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:52:58.007751 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74d689b959-wdc6l"] Apr 17 18:53:23.029921 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.029859 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74d689b959-wdc6l" podUID="035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" containerName="console" containerID="cri-o://97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441" gracePeriod=15 Apr 17 18:53:23.262320 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.262298 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74d689b959-wdc6l_035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3/console/0.log" Apr 17 18:53:23.262423 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.262355 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:53:23.292767 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.292702 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-serving-cert\") pod \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " Apr 17 18:53:23.292767 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.292747 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-oauth-config\") pod \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " Apr 17 18:53:23.292767 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.292767 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-trusted-ca-bundle\") pod \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " Apr 17 18:53:23.293047 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.292789 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-service-ca\") pod \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " Apr 17 18:53:23.293047 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.292824 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-config\") pod \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " Apr 17 18:53:23.293047 ip-10-0-132-192 
kubenswrapper[2571]: I0417 18:53:23.292851 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-oauth-serving-cert\") pod \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " Apr 17 18:53:23.293047 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.292893 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwrt6\" (UniqueName: \"kubernetes.io/projected/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-kube-api-access-fwrt6\") pod \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\" (UID: \"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3\") " Apr 17 18:53:23.293272 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.293239 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-service-ca" (OuterVolumeSpecName: "service-ca") pod "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" (UID: "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:53:23.293327 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.293290 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" (UID: "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:53:23.293486 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.293359 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" (UID: "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:53:23.293569 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.293510 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-config" (OuterVolumeSpecName: "console-config") pod "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" (UID: "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:53:23.295138 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.295110 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-kube-api-access-fwrt6" (OuterVolumeSpecName: "kube-api-access-fwrt6") pod "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" (UID: "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3"). InnerVolumeSpecName "kube-api-access-fwrt6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:53:23.295411 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.295379 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" (UID: "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:53:23.295411 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.295393 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" (UID: "035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:53:23.394080 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.394058 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-config\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:53:23.394080 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.394078 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-oauth-serving-cert\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:53:23.394202 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.394088 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fwrt6\" (UniqueName: \"kubernetes.io/projected/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-kube-api-access-fwrt6\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:53:23.394202 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.394102 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-serving-cert\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:53:23.394202 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.394115 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-console-oauth-config\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:53:23.394202 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.394123 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-trusted-ca-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:53:23.394202 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:23.394133 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3-service-ca\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:53:24.029845 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:24.029819 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74d689b959-wdc6l_035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3/console/0.log" Apr 17 18:53:24.030032 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:24.029856 2571 generic.go:358] "Generic (PLEG): container finished" podID="035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" containerID="97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441" exitCode=2 Apr 17 18:53:24.030032 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:24.029929 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74d689b959-wdc6l" Apr 17 18:53:24.030032 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:24.029950 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d689b959-wdc6l" event={"ID":"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3","Type":"ContainerDied","Data":"97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441"} Apr 17 18:53:24.030032 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:24.029992 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d689b959-wdc6l" event={"ID":"035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3","Type":"ContainerDied","Data":"1bbc2e788543df7f9d7104b0a1625e50e30370b0f23f751c7a6ddac15c399509"} Apr 17 18:53:24.030032 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:24.030014 2571 scope.go:117] "RemoveContainer" containerID="97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441" Apr 17 18:53:24.038110 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:24.038074 2571 scope.go:117] "RemoveContainer" containerID="97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441" Apr 17 18:53:24.038361 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:53:24.038344 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441\": container with ID starting with 97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441 not found: ID does not exist" containerID="97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441" Apr 17 18:53:24.038405 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:24.038369 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441"} err="failed to get container status \"97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441\": rpc error: code = 
NotFound desc = could not find container \"97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441\": container with ID starting with 97048c64c0592bf6b43eb732f594c568a84b4ab6bb5cabce080f03d401e8f441 not found: ID does not exist" Apr 17 18:53:24.048794 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:24.048775 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74d689b959-wdc6l"] Apr 17 18:53:24.055238 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:24.055218 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74d689b959-wdc6l"] Apr 17 18:53:25.236575 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:53:25.236541 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" path="/var/lib/kubelet/pods/035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3/volumes" Apr 17 18:54:06.594244 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.594212 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b5fc8f596-88zsz"] Apr 17 18:54:06.594666 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.594491 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" containerName="console" Apr 17 18:54:06.594666 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.594504 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" containerName="console" Apr 17 18:54:06.594666 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.594550 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="035c0fd9-ae50-4ca0-b94e-c7ab2946d3c3" containerName="console" Apr 17 18:54:06.597167 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.597148 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.608100 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.608079 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b5fc8f596-88zsz"] Apr 17 18:54:06.700506 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.700475 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-oauth-config\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.700651 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.700514 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwmwl\" (UniqueName: \"kubernetes.io/projected/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-kube-api-access-mwmwl\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.700651 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.700577 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-trusted-ca-bundle\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.700651 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.700613 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-oauth-serving-cert\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " 
pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.700651 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.700637 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-service-ca\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.700775 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.700700 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-serving-cert\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.700775 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.700718 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-config\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.801071 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.801042 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-oauth-config\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.801071 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.801074 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwmwl\" (UniqueName: 
\"kubernetes.io/projected/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-kube-api-access-mwmwl\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.801241 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.801093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-trusted-ca-bundle\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.801241 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.801116 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-oauth-serving-cert\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.801307 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.801257 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-service-ca\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.801350 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.801314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-serving-cert\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:54:06.801388 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.801345 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-config\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:06.801935 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.801912 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-oauth-serving-cert\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:06.802071 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.802045 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-service-ca\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:06.802153 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.802093 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-trusted-ca-bundle\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:06.802153 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.802125 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-config\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:06.803690 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.803667 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-oauth-config\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:06.803806 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.803789 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-serving-cert\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:06.809799 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.809779 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwmwl\" (UniqueName: \"kubernetes.io/projected/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-kube-api-access-mwmwl\") pod \"console-7b5fc8f596-88zsz\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:06.906374 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:06.906321 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:07.018199 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:07.018175 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b5fc8f596-88zsz"]
Apr 17 18:54:07.020137 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:54:07.020110 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ffc8cb3_d10c_4e68_8c65_fac351b80c4d.slice/crio-fce4225b0d41aea0efed0d46ae7e9bb5c8ba2e5f7eed445fce5029c880f3da98 WatchSource:0}: Error finding container fce4225b0d41aea0efed0d46ae7e9bb5c8ba2e5f7eed445fce5029c880f3da98: Status 404 returned error can't find the container with id fce4225b0d41aea0efed0d46ae7e9bb5c8ba2e5f7eed445fce5029c880f3da98
Apr 17 18:54:07.147170 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:07.147134 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5fc8f596-88zsz" event={"ID":"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d","Type":"ContainerStarted","Data":"b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276"}
Apr 17 18:54:07.147170 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:07.147171 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5fc8f596-88zsz" event={"ID":"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d","Type":"ContainerStarted","Data":"fce4225b0d41aea0efed0d46ae7e9bb5c8ba2e5f7eed445fce5029c880f3da98"}
Apr 17 18:54:07.163198 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:07.163110 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b5fc8f596-88zsz" podStartSLOduration=1.163094222 podStartE2EDuration="1.163094222s" podCreationTimestamp="2026-04-17 18:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:54:07.161248432 +0000 UTC m=+286.435265725" watchObservedRunningTime="2026-04-17 18:54:07.163094222 +0000 UTC m=+286.437111518"
Apr 17 18:54:16.907422 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:16.907387 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:16.907422 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:16.907427 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:16.912186 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:16.912165 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:17.178242 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:17.178163 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:54:17.220160 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:17.220123 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6569bbf787-ct4jl"]
Apr 17 18:54:21.134633 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:21.134606 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 18:54:42.242338 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.242276 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6569bbf787-ct4jl" podUID="8f331dc6-9b74-4f40-839d-5ee8a72e424f" containerName="console" containerID="cri-o://b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e" gracePeriod=15
Apr 17 18:54:42.472725 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.472703 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6569bbf787-ct4jl_8f331dc6-9b74-4f40-839d-5ee8a72e424f/console/0.log"
Apr 17 18:54:42.472851 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.472772 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:54:42.562893 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.562864 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-service-ca\") pod \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") "
Apr 17 18:54:42.563018 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.562914 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-serving-cert\") pod \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") "
Apr 17 18:54:42.563018 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.562931 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-trusted-ca-bundle\") pod \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") "
Apr 17 18:54:42.563018 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.562956 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-oauth-serving-cert\") pod \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") "
Apr 17 18:54:42.563153 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.563069 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-config\") pod \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") "
Apr 17 18:54:42.563153 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.563100 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmnlb\" (UniqueName: \"kubernetes.io/projected/8f331dc6-9b74-4f40-839d-5ee8a72e424f-kube-api-access-gmnlb\") pod \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") "
Apr 17 18:54:42.563153 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.563134 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-oauth-config\") pod \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\" (UID: \"8f331dc6-9b74-4f40-839d-5ee8a72e424f\") "
Apr 17 18:54:42.563326 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.563296 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-service-ca" (OuterVolumeSpecName: "service-ca") pod "8f331dc6-9b74-4f40-839d-5ee8a72e424f" (UID: "8f331dc6-9b74-4f40-839d-5ee8a72e424f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:54:42.563390 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.563337 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8f331dc6-9b74-4f40-839d-5ee8a72e424f" (UID: "8f331dc6-9b74-4f40-839d-5ee8a72e424f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:54:42.563449 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.563385 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8f331dc6-9b74-4f40-839d-5ee8a72e424f" (UID: "8f331dc6-9b74-4f40-839d-5ee8a72e424f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:54:42.563533 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.563495 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-config" (OuterVolumeSpecName: "console-config") pod "8f331dc6-9b74-4f40-839d-5ee8a72e424f" (UID: "8f331dc6-9b74-4f40-839d-5ee8a72e424f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:54:42.565009 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.564976 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8f331dc6-9b74-4f40-839d-5ee8a72e424f" (UID: "8f331dc6-9b74-4f40-839d-5ee8a72e424f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:54:42.565112 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.565073 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f331dc6-9b74-4f40-839d-5ee8a72e424f-kube-api-access-gmnlb" (OuterVolumeSpecName: "kube-api-access-gmnlb") pod "8f331dc6-9b74-4f40-839d-5ee8a72e424f" (UID: "8f331dc6-9b74-4f40-839d-5ee8a72e424f"). InnerVolumeSpecName "kube-api-access-gmnlb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 18:54:42.565157 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.565104 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8f331dc6-9b74-4f40-839d-5ee8a72e424f" (UID: "8f331dc6-9b74-4f40-839d-5ee8a72e424f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:54:42.664338 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.664307 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-serving-cert\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:54:42.664338 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.664329 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-trusted-ca-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:54:42.664338 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.664338 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-oauth-serving-cert\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:54:42.664338 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.664348 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-config\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:54:42.664699 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.664357 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmnlb\" (UniqueName: \"kubernetes.io/projected/8f331dc6-9b74-4f40-839d-5ee8a72e424f-kube-api-access-gmnlb\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:54:42.664699 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.664366 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f331dc6-9b74-4f40-839d-5ee8a72e424f-console-oauth-config\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:54:42.664699 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:42.664374 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f331dc6-9b74-4f40-839d-5ee8a72e424f-service-ca\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:54:43.246528 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.246508 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6569bbf787-ct4jl_8f331dc6-9b74-4f40-839d-5ee8a72e424f/console/0.log"
Apr 17 18:54:43.246840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.246543 2571 generic.go:358] "Generic (PLEG): container finished" podID="8f331dc6-9b74-4f40-839d-5ee8a72e424f" containerID="b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e" exitCode=2
Apr 17 18:54:43.246840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.246597 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6569bbf787-ct4jl" event={"ID":"8f331dc6-9b74-4f40-839d-5ee8a72e424f","Type":"ContainerDied","Data":"b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e"}
Apr 17 18:54:43.246840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.246616 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6569bbf787-ct4jl" event={"ID":"8f331dc6-9b74-4f40-839d-5ee8a72e424f","Type":"ContainerDied","Data":"4c074e1996e56740310951fdc7adf97659f9f9e1d3a8cc597910a3e90917e91e"}
Apr 17 18:54:43.246840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.246621 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6569bbf787-ct4jl"
Apr 17 18:54:43.246840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.246630 2571 scope.go:117] "RemoveContainer" containerID="b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e"
Apr 17 18:54:43.254042 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.254006 2571 scope.go:117] "RemoveContainer" containerID="b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e"
Apr 17 18:54:43.254267 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:54:43.254249 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e\": container with ID starting with b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e not found: ID does not exist" containerID="b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e"
Apr 17 18:54:43.254325 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.254278 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e"} err="failed to get container status \"b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e\": rpc error: code = NotFound desc = could not find container \"b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e\": container with ID starting with b62d20350487607388991e3b3434b8ed5aaa5bb691fafad729f4f4b5ac438d7e not found: ID does not exist"
Apr 17 18:54:43.266056 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.266031 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6569bbf787-ct4jl"]
Apr 17 18:54:43.272886 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.272867 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6569bbf787-ct4jl"]
Apr 17 18:54:43.288113 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.287794 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-44ptd"]
Apr 17 18:54:43.288229 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.288152 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f331dc6-9b74-4f40-839d-5ee8a72e424f" containerName="console"
Apr 17 18:54:43.288229 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.288167 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f331dc6-9b74-4f40-839d-5ee8a72e424f" containerName="console"
Apr 17 18:54:43.288346 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.288243 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f331dc6-9b74-4f40-839d-5ee8a72e424f" containerName="console"
Apr 17 18:54:43.292213 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.292196 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-44ptd"
Apr 17 18:54:43.294108 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.294092 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 18:54:43.297473 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.297436 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-44ptd"]
Apr 17 18:54:43.369751 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.369720 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a2d6e798-4c54-4a87-9001-6aa609214c8a-dbus\") pod \"global-pull-secret-syncer-44ptd\" (UID: \"a2d6e798-4c54-4a87-9001-6aa609214c8a\") " pod="kube-system/global-pull-secret-syncer-44ptd"
Apr 17 18:54:43.369904 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.369778 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a2d6e798-4c54-4a87-9001-6aa609214c8a-kubelet-config\") pod \"global-pull-secret-syncer-44ptd\" (UID: \"a2d6e798-4c54-4a87-9001-6aa609214c8a\") " pod="kube-system/global-pull-secret-syncer-44ptd"
Apr 17 18:54:43.369904 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.369800 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a2d6e798-4c54-4a87-9001-6aa609214c8a-original-pull-secret\") pod \"global-pull-secret-syncer-44ptd\" (UID: \"a2d6e798-4c54-4a87-9001-6aa609214c8a\") " pod="kube-system/global-pull-secret-syncer-44ptd"
Apr 17 18:54:43.470665 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.470620 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a2d6e798-4c54-4a87-9001-6aa609214c8a-kubelet-config\") pod \"global-pull-secret-syncer-44ptd\" (UID: \"a2d6e798-4c54-4a87-9001-6aa609214c8a\") " pod="kube-system/global-pull-secret-syncer-44ptd"
Apr 17 18:54:43.470665 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.470671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a2d6e798-4c54-4a87-9001-6aa609214c8a-original-pull-secret\") pod \"global-pull-secret-syncer-44ptd\" (UID: \"a2d6e798-4c54-4a87-9001-6aa609214c8a\") " pod="kube-system/global-pull-secret-syncer-44ptd"
Apr 17 18:54:43.470803 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.470704 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a2d6e798-4c54-4a87-9001-6aa609214c8a-dbus\") pod \"global-pull-secret-syncer-44ptd\" (UID: \"a2d6e798-4c54-4a87-9001-6aa609214c8a\") " pod="kube-system/global-pull-secret-syncer-44ptd"
Apr 17 18:54:43.470803 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.470745 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a2d6e798-4c54-4a87-9001-6aa609214c8a-kubelet-config\") pod \"global-pull-secret-syncer-44ptd\" (UID: \"a2d6e798-4c54-4a87-9001-6aa609214c8a\") " pod="kube-system/global-pull-secret-syncer-44ptd"
Apr 17 18:54:43.470867 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.470844 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a2d6e798-4c54-4a87-9001-6aa609214c8a-dbus\") pod \"global-pull-secret-syncer-44ptd\" (UID: \"a2d6e798-4c54-4a87-9001-6aa609214c8a\") " pod="kube-system/global-pull-secret-syncer-44ptd"
Apr 17 18:54:43.472964 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.472944 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a2d6e798-4c54-4a87-9001-6aa609214c8a-original-pull-secret\") pod \"global-pull-secret-syncer-44ptd\" (UID: \"a2d6e798-4c54-4a87-9001-6aa609214c8a\") " pod="kube-system/global-pull-secret-syncer-44ptd"
Apr 17 18:54:43.600961 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.600931 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-44ptd"
Apr 17 18:54:43.715603 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.715564 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-44ptd"]
Apr 17 18:54:43.719953 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:54:43.719917 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2d6e798_4c54_4a87_9001_6aa609214c8a.slice/crio-3e9711aa7a850b0ae40a5351c0b8b7751049fed0d6932cca09070a71329e96cf WatchSource:0}: Error finding container 3e9711aa7a850b0ae40a5351c0b8b7751049fed0d6932cca09070a71329e96cf: Status 404 returned error can't find the container with id 3e9711aa7a850b0ae40a5351c0b8b7751049fed0d6932cca09070a71329e96cf
Apr 17 18:54:43.721542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:43.721525 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 18:54:44.253060 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:44.253018 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-44ptd" event={"ID":"a2d6e798-4c54-4a87-9001-6aa609214c8a","Type":"ContainerStarted","Data":"3e9711aa7a850b0ae40a5351c0b8b7751049fed0d6932cca09070a71329e96cf"}
Apr 17 18:54:45.238735 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:45.238703 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f331dc6-9b74-4f40-839d-5ee8a72e424f" path="/var/lib/kubelet/pods/8f331dc6-9b74-4f40-839d-5ee8a72e424f/volumes"
Apr 17 18:54:48.268328 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:48.268287 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-44ptd" event={"ID":"a2d6e798-4c54-4a87-9001-6aa609214c8a","Type":"ContainerStarted","Data":"ee81a4d74e5df8771b6ec35073381992159dfb33c1b0ff81311166c2a8df8edc"}
Apr 17 18:54:48.283050 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:48.282985 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-44ptd" podStartSLOduration=0.921587918 podStartE2EDuration="5.282969478s" podCreationTimestamp="2026-04-17 18:54:43 +0000 UTC" firstStartedPulling="2026-04-17 18:54:43.721680734 +0000 UTC m=+322.995698005" lastFinishedPulling="2026-04-17 18:54:48.083062069 +0000 UTC m=+327.357079565" observedRunningTime="2026-04-17 18:54:48.281673798 +0000 UTC m=+327.555691091" watchObservedRunningTime="2026-04-17 18:54:48.282969478 +0000 UTC m=+327.556986771"
Apr 17 18:54:55.376165 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.376129 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"]
Apr 17 18:54:55.379532 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.379514 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"
Apr 17 18:54:55.381784 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.381760 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 18:54:55.381933 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.381873 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 18:54:55.382635 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.382622 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sx5lk\""
Apr 17 18:54:55.387314 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.387289 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"]
Apr 17 18:54:55.463351 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.463322 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km5vh\" (UniqueName: \"kubernetes.io/projected/f1c1beac-302c-48a0-911a-5686349806dd-kube-api-access-km5vh\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6\" (UID: \"f1c1beac-302c-48a0-911a-5686349806dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"
Apr 17 18:54:55.463506 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.463359 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1c1beac-302c-48a0-911a-5686349806dd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6\" (UID: \"f1c1beac-302c-48a0-911a-5686349806dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"
Apr 17 18:54:55.463506 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.463427 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1c1beac-302c-48a0-911a-5686349806dd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6\" (UID: \"f1c1beac-302c-48a0-911a-5686349806dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"
Apr 17 18:54:55.564219 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.564192 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1c1beac-302c-48a0-911a-5686349806dd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6\" (UID: \"f1c1beac-302c-48a0-911a-5686349806dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"
Apr 17 18:54:55.564311 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.564236 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-km5vh\" (UniqueName: \"kubernetes.io/projected/f1c1beac-302c-48a0-911a-5686349806dd-kube-api-access-km5vh\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6\" (UID: \"f1c1beac-302c-48a0-911a-5686349806dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"
Apr 17 18:54:55.564311 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.564285 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1c1beac-302c-48a0-911a-5686349806dd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6\" (UID: \"f1c1beac-302c-48a0-911a-5686349806dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"
Apr 17 18:54:55.564661 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.564642 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1c1beac-302c-48a0-911a-5686349806dd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6\" (UID: \"f1c1beac-302c-48a0-911a-5686349806dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"
Apr 17 18:54:55.564700 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.564656 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1c1beac-302c-48a0-911a-5686349806dd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6\" (UID: \"f1c1beac-302c-48a0-911a-5686349806dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"
Apr 17 18:54:55.571752 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.571730 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-km5vh\" (UniqueName: \"kubernetes.io/projected/f1c1beac-302c-48a0-911a-5686349806dd-kube-api-access-km5vh\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6\" (UID: \"f1c1beac-302c-48a0-911a-5686349806dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"
Apr 17 18:54:55.689830 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.689768 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"
Apr 17 18:54:55.803699 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:55.803673 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"]
Apr 17 18:54:55.805743 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:54:55.805710 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1c1beac_302c_48a0_911a_5686349806dd.slice/crio-68a78c0cc4f82d11fa86ee255e793ee722cb5d67b05402c37734d2f5af3e44cf WatchSource:0}: Error finding container 68a78c0cc4f82d11fa86ee255e793ee722cb5d67b05402c37734d2f5af3e44cf: Status 404 returned error can't find the container with id 68a78c0cc4f82d11fa86ee255e793ee722cb5d67b05402c37734d2f5af3e44cf
Apr 17 18:54:56.288243 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:54:56.288204 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6" event={"ID":"f1c1beac-302c-48a0-911a-5686349806dd","Type":"ContainerStarted","Data":"68a78c0cc4f82d11fa86ee255e793ee722cb5d67b05402c37734d2f5af3e44cf"}
Apr 17 18:55:03.310085 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:03.310006 2571 generic.go:358] "Generic (PLEG): container finished" podID="f1c1beac-302c-48a0-911a-5686349806dd" containerID="fc1f8faa13e88a3044d772a4580241bdfbf9b17c56bbd05cb618591fc05d22b0" exitCode=0
Apr 17 18:55:03.310085 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:03.310081 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6" event={"ID":"f1c1beac-302c-48a0-911a-5686349806dd","Type":"ContainerDied","Data":"fc1f8faa13e88a3044d772a4580241bdfbf9b17c56bbd05cb618591fc05d22b0"}
Apr 17 18:55:06.320576 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:06.320544 2571 generic.go:358] "Generic (PLEG): container finished" podID="f1c1beac-302c-48a0-911a-5686349806dd" containerID="1caf5934ad56a115dca46169f996e53af8d5be92e4195096dc9753922359bc91" exitCode=0
Apr 17 18:55:06.320933 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:06.320613 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6" event={"ID":"f1c1beac-302c-48a0-911a-5686349806dd","Type":"ContainerDied","Data":"1caf5934ad56a115dca46169f996e53af8d5be92e4195096dc9753922359bc91"}
Apr 17 18:55:14.345414 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:14.345380 2571 generic.go:358] "Generic (PLEG): container finished" podID="f1c1beac-302c-48a0-911a-5686349806dd" containerID="2270827b347f7f5992d2106bc5121e8adf2b81b11b733b0742dfd263a31412f0" exitCode=0
Apr 17 18:55:14.345797 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:14.345473 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6" event={"ID":"f1c1beac-302c-48a0-911a-5686349806dd","Type":"ContainerDied","Data":"2270827b347f7f5992d2106bc5121e8adf2b81b11b733b0742dfd263a31412f0"}
Apr 17 18:55:15.472129 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:15.472105 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6"
Apr 17 18:55:15.536409 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:15.536377 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1c1beac-302c-48a0-911a-5686349806dd-util\") pod \"f1c1beac-302c-48a0-911a-5686349806dd\" (UID: \"f1c1beac-302c-48a0-911a-5686349806dd\") "
Apr 17 18:55:15.536574 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:15.536430 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1c1beac-302c-48a0-911a-5686349806dd-bundle\") pod \"f1c1beac-302c-48a0-911a-5686349806dd\" (UID: \"f1c1beac-302c-48a0-911a-5686349806dd\") "
Apr 17 18:55:15.536574 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:15.536500 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km5vh\" (UniqueName: \"kubernetes.io/projected/f1c1beac-302c-48a0-911a-5686349806dd-kube-api-access-km5vh\") pod \"f1c1beac-302c-48a0-911a-5686349806dd\" (UID: \"f1c1beac-302c-48a0-911a-5686349806dd\") "
Apr 17 18:55:15.537079 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:15.537057 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1c1beac-302c-48a0-911a-5686349806dd-bundle" (OuterVolumeSpecName: "bundle") pod "f1c1beac-302c-48a0-911a-5686349806dd" (UID: "f1c1beac-302c-48a0-911a-5686349806dd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 18:55:15.538769 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:15.538750 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1c1beac-302c-48a0-911a-5686349806dd-kube-api-access-km5vh" (OuterVolumeSpecName: "kube-api-access-km5vh") pod "f1c1beac-302c-48a0-911a-5686349806dd" (UID: "f1c1beac-302c-48a0-911a-5686349806dd"). InnerVolumeSpecName "kube-api-access-km5vh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 18:55:15.540430 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:15.540409 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1c1beac-302c-48a0-911a-5686349806dd-util" (OuterVolumeSpecName: "util") pod "f1c1beac-302c-48a0-911a-5686349806dd" (UID: "f1c1beac-302c-48a0-911a-5686349806dd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 18:55:15.637184 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:15.637124 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1c1beac-302c-48a0-911a-5686349806dd-util\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:55:15.637184 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:15.637144 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1c1beac-302c-48a0-911a-5686349806dd-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:55:15.637184 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:15.637154 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-km5vh\" (UniqueName: \"kubernetes.io/projected/f1c1beac-302c-48a0-911a-5686349806dd-kube-api-access-km5vh\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:55:16.352212 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:16.352174 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6" event={"ID":"f1c1beac-302c-48a0-911a-5686349806dd","Type":"ContainerDied","Data":"68a78c0cc4f82d11fa86ee255e793ee722cb5d67b05402c37734d2f5af3e44cf"} Apr 17 18:55:16.352212 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:16.352214 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68a78c0cc4f82d11fa86ee255e793ee722cb5d67b05402c37734d2f5af3e44cf" Apr 17 18:55:16.352409 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:16.352242 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e558mj6" Apr 17 18:55:22.465619 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.465588 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57"] Apr 17 18:55:22.466097 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.466018 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1c1beac-302c-48a0-911a-5686349806dd" containerName="extract" Apr 17 18:55:22.466097 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.466036 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c1beac-302c-48a0-911a-5686349806dd" containerName="extract" Apr 17 18:55:22.466097 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.466050 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1c1beac-302c-48a0-911a-5686349806dd" containerName="pull" Apr 17 18:55:22.466097 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.466059 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c1beac-302c-48a0-911a-5686349806dd" containerName="pull" Apr 17 18:55:22.466097 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.466076 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f1c1beac-302c-48a0-911a-5686349806dd" containerName="util" Apr 17 18:55:22.466097 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.466084 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c1beac-302c-48a0-911a-5686349806dd" containerName="util" Apr 17 18:55:22.466382 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.466162 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1c1beac-302c-48a0-911a-5686349806dd" containerName="extract" Apr 17 18:55:22.472921 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.472900 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57" Apr 17 18:55:22.475160 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.475142 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:55:22.475241 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.475145 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 17 18:55:22.475241 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.475220 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-w8ddt\"" Apr 17 18:55:22.477952 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.477930 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57"] Apr 17 18:55:22.591063 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.591029 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86bd38a4-ad0c-4b43-b98a-d3cae1d784ae-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-klb57\" (UID: 
\"86bd38a4-ad0c-4b43-b98a-d3cae1d784ae\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57" Apr 17 18:55:22.591063 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.591065 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9svcw\" (UniqueName: \"kubernetes.io/projected/86bd38a4-ad0c-4b43-b98a-d3cae1d784ae-kube-api-access-9svcw\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-klb57\" (UID: \"86bd38a4-ad0c-4b43-b98a-d3cae1d784ae\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57" Apr 17 18:55:22.692075 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.692043 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86bd38a4-ad0c-4b43-b98a-d3cae1d784ae-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-klb57\" (UID: \"86bd38a4-ad0c-4b43-b98a-d3cae1d784ae\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57" Apr 17 18:55:22.692270 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.692089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9svcw\" (UniqueName: \"kubernetes.io/projected/86bd38a4-ad0c-4b43-b98a-d3cae1d784ae-kube-api-access-9svcw\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-klb57\" (UID: \"86bd38a4-ad0c-4b43-b98a-d3cae1d784ae\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57" Apr 17 18:55:22.692496 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.692450 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86bd38a4-ad0c-4b43-b98a-d3cae1d784ae-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-klb57\" (UID: \"86bd38a4-ad0c-4b43-b98a-d3cae1d784ae\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57" Apr 17 18:55:22.700818 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.700791 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9svcw\" (UniqueName: \"kubernetes.io/projected/86bd38a4-ad0c-4b43-b98a-d3cae1d784ae-kube-api-access-9svcw\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-klb57\" (UID: \"86bd38a4-ad0c-4b43-b98a-d3cae1d784ae\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57" Apr 17 18:55:22.782062 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.782029 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57" Apr 17 18:55:22.901759 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:22.901731 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57"] Apr 17 18:55:22.904068 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:55:22.904026 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86bd38a4_ad0c_4b43_b98a_d3cae1d784ae.slice/crio-65abfe72c32348ecd1af72a5f06c905f4c2400bc33a188e1663e3eb449cb9704 WatchSource:0}: Error finding container 65abfe72c32348ecd1af72a5f06c905f4c2400bc33a188e1663e3eb449cb9704: Status 404 returned error can't find the container with id 65abfe72c32348ecd1af72a5f06c905f4c2400bc33a188e1663e3eb449cb9704 Apr 17 18:55:23.371480 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:23.371424 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57" event={"ID":"86bd38a4-ad0c-4b43-b98a-d3cae1d784ae","Type":"ContainerStarted","Data":"65abfe72c32348ecd1af72a5f06c905f4c2400bc33a188e1663e3eb449cb9704"} Apr 17 18:55:25.379850 
ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:25.379766 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57" event={"ID":"86bd38a4-ad0c-4b43-b98a-d3cae1d784ae","Type":"ContainerStarted","Data":"96543cf8977db4d3f3e37d45b85d37aaf9088e9408d79e3e6756b07af94a3ff1"} Apr 17 18:55:25.400492 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:25.400433 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-klb57" podStartSLOduration=1.182520818 podStartE2EDuration="3.4004197s" podCreationTimestamp="2026-04-17 18:55:22 +0000 UTC" firstStartedPulling="2026-04-17 18:55:22.906450508 +0000 UTC m=+362.180467779" lastFinishedPulling="2026-04-17 18:55:25.124349382 +0000 UTC m=+364.398366661" observedRunningTime="2026-04-17 18:55:25.398884264 +0000 UTC m=+364.672901556" watchObservedRunningTime="2026-04-17 18:55:25.4004197 +0000 UTC m=+364.674436994" Apr 17 18:55:26.616957 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.616928 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2"] Apr 17 18:55:26.620372 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.620356 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:26.622548 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.622526 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 18:55:26.622966 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.622952 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sx5lk\"" Apr 17 18:55:26.623298 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.623280 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 18:55:26.626928 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.626909 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2"] Apr 17 18:55:26.728291 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.728257 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd7ch\" (UniqueName: \"kubernetes.io/projected/bde30110-7d0e-41ee-9c8b-8beb27ad7027-kube-api-access-pd7ch\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2\" (UID: \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:26.728444 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.728299 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bde30110-7d0e-41ee-9c8b-8beb27ad7027-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2\" (UID: \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:26.728444 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.728327 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bde30110-7d0e-41ee-9c8b-8beb27ad7027-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2\" (UID: \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:26.829359 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.829317 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd7ch\" (UniqueName: \"kubernetes.io/projected/bde30110-7d0e-41ee-9c8b-8beb27ad7027-kube-api-access-pd7ch\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2\" (UID: \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:26.829359 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.829361 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bde30110-7d0e-41ee-9c8b-8beb27ad7027-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2\" (UID: \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:26.829571 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.829501 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bde30110-7d0e-41ee-9c8b-8beb27ad7027-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2\" (UID: \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:26.829754 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.829733 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bde30110-7d0e-41ee-9c8b-8beb27ad7027-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2\" (UID: \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:26.829794 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.829785 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bde30110-7d0e-41ee-9c8b-8beb27ad7027-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2\" (UID: \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:26.837150 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.837122 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd7ch\" (UniqueName: \"kubernetes.io/projected/bde30110-7d0e-41ee-9c8b-8beb27ad7027-kube-api-access-pd7ch\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2\" (UID: \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:26.930771 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:26.930702 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:27.047781 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:27.047754 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2"] Apr 17 18:55:27.050306 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:55:27.050280 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde30110_7d0e_41ee_9c8b_8beb27ad7027.slice/crio-2c7eea41f062021bf769c1012a0fde6bab6aae77532eaf2ee41a0c9641f713f5 WatchSource:0}: Error finding container 2c7eea41f062021bf769c1012a0fde6bab6aae77532eaf2ee41a0c9641f713f5: Status 404 returned error can't find the container with id 2c7eea41f062021bf769c1012a0fde6bab6aae77532eaf2ee41a0c9641f713f5 Apr 17 18:55:27.387863 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:27.387829 2571 generic.go:358] "Generic (PLEG): container finished" podID="bde30110-7d0e-41ee-9c8b-8beb27ad7027" containerID="4a772a040cfd08bf1baee0542ab51fee90c51e19a6c5170b024692d01a5605ca" exitCode=0 Apr 17 18:55:27.388011 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:27.387873 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" event={"ID":"bde30110-7d0e-41ee-9c8b-8beb27ad7027","Type":"ContainerDied","Data":"4a772a040cfd08bf1baee0542ab51fee90c51e19a6c5170b024692d01a5605ca"} Apr 17 18:55:27.388011 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:27.387899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" event={"ID":"bde30110-7d0e-41ee-9c8b-8beb27ad7027","Type":"ContainerStarted","Data":"2c7eea41f062021bf769c1012a0fde6bab6aae77532eaf2ee41a0c9641f713f5"} Apr 17 18:55:29.109659 ip-10-0-132-192 kubenswrapper[2571]: 
I0417 18:55:29.109613 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-z9l4j"] Apr 17 18:55:29.119851 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.119827 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" Apr 17 18:55:29.123265 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.123237 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 18:55:29.124034 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.124015 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 18:55:29.124174 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.124086 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-vqtrs\"" Apr 17 18:55:29.129391 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.129365 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-z9l4j"] Apr 17 18:55:29.253027 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.252994 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mgjr\" (UniqueName: \"kubernetes.io/projected/3abbbcab-26db-4c10-9489-ea9d65127a31-kube-api-access-8mgjr\") pod \"cert-manager-webhook-597b96b99b-z9l4j\" (UID: \"3abbbcab-26db-4c10-9489-ea9d65127a31\") " pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" Apr 17 18:55:29.253182 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.253109 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3abbbcab-26db-4c10-9489-ea9d65127a31-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-z9l4j\" (UID: 
\"3abbbcab-26db-4c10-9489-ea9d65127a31\") " pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" Apr 17 18:55:29.353737 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.353716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3abbbcab-26db-4c10-9489-ea9d65127a31-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-z9l4j\" (UID: \"3abbbcab-26db-4c10-9489-ea9d65127a31\") " pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" Apr 17 18:55:29.353830 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.353774 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mgjr\" (UniqueName: \"kubernetes.io/projected/3abbbcab-26db-4c10-9489-ea9d65127a31-kube-api-access-8mgjr\") pod \"cert-manager-webhook-597b96b99b-z9l4j\" (UID: \"3abbbcab-26db-4c10-9489-ea9d65127a31\") " pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" Apr 17 18:55:29.362824 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.362773 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mgjr\" (UniqueName: \"kubernetes.io/projected/3abbbcab-26db-4c10-9489-ea9d65127a31-kube-api-access-8mgjr\") pod \"cert-manager-webhook-597b96b99b-z9l4j\" (UID: \"3abbbcab-26db-4c10-9489-ea9d65127a31\") " pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" Apr 17 18:55:29.363161 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.363143 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3abbbcab-26db-4c10-9489-ea9d65127a31-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-z9l4j\" (UID: \"3abbbcab-26db-4c10-9489-ea9d65127a31\") " pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" Apr 17 18:55:29.401682 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.401651 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" event={"ID":"bde30110-7d0e-41ee-9c8b-8beb27ad7027","Type":"ContainerStarted","Data":"0cbee877bf6cd6d60ccdd12f19cf1df46d3487dd328bd87a025b15af3a6e7d96"} Apr 17 18:55:29.431913 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.431886 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" Apr 17 18:55:29.548431 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:29.548403 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-z9l4j"] Apr 17 18:55:29.551214 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:55:29.551189 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3abbbcab_26db_4c10_9489_ea9d65127a31.slice/crio-2866dc4136e25eca6d4f34a82315ab6457eea2501e29bd7b21aec68a57a4e025 WatchSource:0}: Error finding container 2866dc4136e25eca6d4f34a82315ab6457eea2501e29bd7b21aec68a57a4e025: Status 404 returned error can't find the container with id 2866dc4136e25eca6d4f34a82315ab6457eea2501e29bd7b21aec68a57a4e025 Apr 17 18:55:30.405958 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:30.405921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" event={"ID":"3abbbcab-26db-4c10-9489-ea9d65127a31","Type":"ContainerStarted","Data":"2866dc4136e25eca6d4f34a82315ab6457eea2501e29bd7b21aec68a57a4e025"} Apr 17 18:55:30.407432 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:30.407406 2571 generic.go:358] "Generic (PLEG): container finished" podID="bde30110-7d0e-41ee-9c8b-8beb27ad7027" containerID="0cbee877bf6cd6d60ccdd12f19cf1df46d3487dd328bd87a025b15af3a6e7d96" exitCode=0 Apr 17 18:55:30.407557 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:30.407443 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" event={"ID":"bde30110-7d0e-41ee-9c8b-8beb27ad7027","Type":"ContainerDied","Data":"0cbee877bf6cd6d60ccdd12f19cf1df46d3487dd328bd87a025b15af3a6e7d96"} Apr 17 18:55:31.413241 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:31.413190 2571 generic.go:358] "Generic (PLEG): container finished" podID="bde30110-7d0e-41ee-9c8b-8beb27ad7027" containerID="33ec18c290ac978946902ac0b5f74770d1f882e75505155e77891d1020345caa" exitCode=0 Apr 17 18:55:31.413707 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:31.413268 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" event={"ID":"bde30110-7d0e-41ee-9c8b-8beb27ad7027","Type":"ContainerDied","Data":"33ec18c290ac978946902ac0b5f74770d1f882e75505155e77891d1020345caa"} Apr 17 18:55:32.558001 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:32.557980 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:32.682948 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:32.682869 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bde30110-7d0e-41ee-9c8b-8beb27ad7027-bundle\") pod \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\" (UID: \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\") " Apr 17 18:55:32.683082 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:32.682971 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd7ch\" (UniqueName: \"kubernetes.io/projected/bde30110-7d0e-41ee-9c8b-8beb27ad7027-kube-api-access-pd7ch\") pod \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\" (UID: \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\") " Apr 17 18:55:32.683082 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:32.683010 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bde30110-7d0e-41ee-9c8b-8beb27ad7027-util\") pod \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\" (UID: \"bde30110-7d0e-41ee-9c8b-8beb27ad7027\") " Apr 17 18:55:32.683261 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:32.683241 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde30110-7d0e-41ee-9c8b-8beb27ad7027-bundle" (OuterVolumeSpecName: "bundle") pod "bde30110-7d0e-41ee-9c8b-8beb27ad7027" (UID: "bde30110-7d0e-41ee-9c8b-8beb27ad7027"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:55:32.684919 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:32.684897 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde30110-7d0e-41ee-9c8b-8beb27ad7027-kube-api-access-pd7ch" (OuterVolumeSpecName: "kube-api-access-pd7ch") pod "bde30110-7d0e-41ee-9c8b-8beb27ad7027" (UID: "bde30110-7d0e-41ee-9c8b-8beb27ad7027"). InnerVolumeSpecName "kube-api-access-pd7ch". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:55:32.687264 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:32.687240 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde30110-7d0e-41ee-9c8b-8beb27ad7027-util" (OuterVolumeSpecName: "util") pod "bde30110-7d0e-41ee-9c8b-8beb27ad7027" (UID: "bde30110-7d0e-41ee-9c8b-8beb27ad7027"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:55:32.783864 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:32.783829 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bde30110-7d0e-41ee-9c8b-8beb27ad7027-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:55:32.783864 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:32.783858 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pd7ch\" (UniqueName: \"kubernetes.io/projected/bde30110-7d0e-41ee-9c8b-8beb27ad7027-kube-api-access-pd7ch\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:55:32.783864 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:32.783870 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bde30110-7d0e-41ee-9c8b-8beb27ad7027-util\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:55:33.420988 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:33.420952 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" event={"ID":"3abbbcab-26db-4c10-9489-ea9d65127a31","Type":"ContainerStarted","Data":"24af3adaf4540e05d9cfeacf4758e4b4235e820c86319f38bec0cc985c2c661f"} Apr 17 18:55:33.421179 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:33.421046 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" Apr 17 18:55:33.422628 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:33.422603 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" event={"ID":"bde30110-7d0e-41ee-9c8b-8beb27ad7027","Type":"ContainerDied","Data":"2c7eea41f062021bf769c1012a0fde6bab6aae77532eaf2ee41a0c9641f713f5"} Apr 17 18:55:33.422742 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:33.422631 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c7eea41f062021bf769c1012a0fde6bab6aae77532eaf2ee41a0c9641f713f5" Apr 17 18:55:33.422742 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:33.422642 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7kmt2" Apr 17 18:55:33.436773 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:33.436735 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" podStartSLOduration=1.445029218 podStartE2EDuration="4.43672304s" podCreationTimestamp="2026-04-17 18:55:29 +0000 UTC" firstStartedPulling="2026-04-17 18:55:29.553171762 +0000 UTC m=+368.827189036" lastFinishedPulling="2026-04-17 18:55:32.544865581 +0000 UTC m=+371.818882858" observedRunningTime="2026-04-17 18:55:33.435370584 +0000 UTC m=+372.709387881" watchObservedRunningTime="2026-04-17 18:55:33.43672304 +0000 UTC m=+372.710740333" Apr 17 18:55:38.651557 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.651524 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb"] Apr 17 18:55:38.651934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.651822 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bde30110-7d0e-41ee-9c8b-8beb27ad7027" containerName="extract" Apr 17 18:55:38.651934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.651833 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde30110-7d0e-41ee-9c8b-8beb27ad7027" containerName="extract" Apr 17 18:55:38.651934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.651847 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bde30110-7d0e-41ee-9c8b-8beb27ad7027" containerName="pull" Apr 17 18:55:38.651934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.651852 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde30110-7d0e-41ee-9c8b-8beb27ad7027" containerName="pull" Apr 17 18:55:38.651934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.651865 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="bde30110-7d0e-41ee-9c8b-8beb27ad7027" containerName="util" Apr 17 18:55:38.651934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.651871 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde30110-7d0e-41ee-9c8b-8beb27ad7027" containerName="util" Apr 17 18:55:38.651934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.651920 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bde30110-7d0e-41ee-9c8b-8beb27ad7027" containerName="extract" Apr 17 18:55:38.654772 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.654753 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb" Apr 17 18:55:38.656823 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.656799 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-spdnf\"" Apr 17 18:55:38.656959 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.656838 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:55:38.657677 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.657660 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 18:55:38.663665 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.663646 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb"] Apr 17 18:55:38.722679 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.722659 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52f3400c-0624-46b1-a696-6ef36d97c1ef-tmp\") pod \"openshift-lws-operator-bfc7f696d-bgmdb\" (UID: \"52f3400c-0624-46b1-a696-6ef36d97c1ef\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb" Apr 17 18:55:38.722776 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.722700 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97gvx\" (UniqueName: \"kubernetes.io/projected/52f3400c-0624-46b1-a696-6ef36d97c1ef-kube-api-access-97gvx\") pod \"openshift-lws-operator-bfc7f696d-bgmdb\" (UID: \"52f3400c-0624-46b1-a696-6ef36d97c1ef\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb" Apr 17 18:55:38.824043 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.824011 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52f3400c-0624-46b1-a696-6ef36d97c1ef-tmp\") pod \"openshift-lws-operator-bfc7f696d-bgmdb\" (UID: \"52f3400c-0624-46b1-a696-6ef36d97c1ef\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb" Apr 17 18:55:38.824164 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.824057 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97gvx\" (UniqueName: \"kubernetes.io/projected/52f3400c-0624-46b1-a696-6ef36d97c1ef-kube-api-access-97gvx\") pod \"openshift-lws-operator-bfc7f696d-bgmdb\" (UID: \"52f3400c-0624-46b1-a696-6ef36d97c1ef\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb" Apr 17 18:55:38.824400 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.824381 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52f3400c-0624-46b1-a696-6ef36d97c1ef-tmp\") pod \"openshift-lws-operator-bfc7f696d-bgmdb\" (UID: \"52f3400c-0624-46b1-a696-6ef36d97c1ef\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb" Apr 17 18:55:38.831450 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.831429 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-97gvx\" (UniqueName: \"kubernetes.io/projected/52f3400c-0624-46b1-a696-6ef36d97c1ef-kube-api-access-97gvx\") pod \"openshift-lws-operator-bfc7f696d-bgmdb\" (UID: \"52f3400c-0624-46b1-a696-6ef36d97c1ef\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb" Apr 17 18:55:38.964749 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:38.964683 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb" Apr 17 18:55:39.079414 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:39.079391 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb"] Apr 17 18:55:39.081967 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:55:39.081937 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52f3400c_0624_46b1_a696_6ef36d97c1ef.slice/crio-18d895bd28f408ac286067ee581276289092cdf70164f9c64871da442bfd32b0 WatchSource:0}: Error finding container 18d895bd28f408ac286067ee581276289092cdf70164f9c64871da442bfd32b0: Status 404 returned error can't find the container with id 18d895bd28f408ac286067ee581276289092cdf70164f9c64871da442bfd32b0 Apr 17 18:55:39.428193 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:39.428168 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-z9l4j" Apr 17 18:55:39.441008 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:39.440979 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb" event={"ID":"52f3400c-0624-46b1-a696-6ef36d97c1ef","Type":"ContainerStarted","Data":"18d895bd28f408ac286067ee581276289092cdf70164f9c64871da442bfd32b0"} Apr 17 18:55:41.450140 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:41.450100 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb" event={"ID":"52f3400c-0624-46b1-a696-6ef36d97c1ef","Type":"ContainerStarted","Data":"8c71baa27c13ecf8913936573e8d0727fffa3d89a2777a12dfad238b5a735a26"} Apr 17 18:55:41.464374 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:41.464272 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bgmdb" podStartSLOduration=1.94150885 podStartE2EDuration="3.464255871s" podCreationTimestamp="2026-04-17 18:55:38 +0000 UTC" firstStartedPulling="2026-04-17 18:55:39.083363598 +0000 UTC m=+378.357380870" lastFinishedPulling="2026-04-17 18:55:40.606110619 +0000 UTC m=+379.880127891" observedRunningTime="2026-04-17 18:55:41.463699042 +0000 UTC m=+380.737716335" watchObservedRunningTime="2026-04-17 18:55:41.464255871 +0000 UTC m=+380.738273164" Apr 17 18:55:44.150104 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.150070 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4"] Apr 17 18:55:44.153739 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.153717 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:55:44.155847 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.155823 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 18:55:44.155939 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.155867 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 18:55:44.156533 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.156514 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sx5lk\"" Apr 17 18:55:44.157856 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.157834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsnkh\" (UniqueName: \"kubernetes.io/projected/5ee1ba24-cbb2-47f2-87ce-02a750d08102-kube-api-access-qsnkh\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4\" (UID: \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:55:44.157957 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.157885 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ee1ba24-cbb2-47f2-87ce-02a750d08102-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4\" (UID: \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:55:44.157957 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.157949 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5ee1ba24-cbb2-47f2-87ce-02a750d08102-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4\" (UID: \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:55:44.160476 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.160439 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4"] Apr 17 18:55:44.258926 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.258889 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ee1ba24-cbb2-47f2-87ce-02a750d08102-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4\" (UID: \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:55:44.259088 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.258939 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ee1ba24-cbb2-47f2-87ce-02a750d08102-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4\" (UID: \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:55:44.259088 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.258986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsnkh\" (UniqueName: \"kubernetes.io/projected/5ee1ba24-cbb2-47f2-87ce-02a750d08102-kube-api-access-qsnkh\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4\" (UID: \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:55:44.259293 
ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.259274 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ee1ba24-cbb2-47f2-87ce-02a750d08102-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4\" (UID: \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:55:44.259354 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.259338 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ee1ba24-cbb2-47f2-87ce-02a750d08102-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4\" (UID: \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:55:44.267154 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.267123 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsnkh\" (UniqueName: \"kubernetes.io/projected/5ee1ba24-cbb2-47f2-87ce-02a750d08102-kube-api-access-qsnkh\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4\" (UID: \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:55:44.463809 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.463726 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:55:44.580953 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:44.580925 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4"] Apr 17 18:55:44.583181 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:55:44.583156 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ee1ba24_cbb2_47f2_87ce_02a750d08102.slice/crio-80ac39255482975b49c272509d945bc827ab14a7423e452efd1050f6b7c6fbe2 WatchSource:0}: Error finding container 80ac39255482975b49c272509d945bc827ab14a7423e452efd1050f6b7c6fbe2: Status 404 returned error can't find the container with id 80ac39255482975b49c272509d945bc827ab14a7423e452efd1050f6b7c6fbe2 Apr 17 18:55:45.462962 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:45.462926 2571 generic.go:358] "Generic (PLEG): container finished" podID="5ee1ba24-cbb2-47f2-87ce-02a750d08102" containerID="494829a06869440dc78184c0595ef5c9d4724c4275b850cd076913a5d9ea96ad" exitCode=0 Apr 17 18:55:45.463324 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:45.463012 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" event={"ID":"5ee1ba24-cbb2-47f2-87ce-02a750d08102","Type":"ContainerDied","Data":"494829a06869440dc78184c0595ef5c9d4724c4275b850cd076913a5d9ea96ad"} Apr 17 18:55:45.463324 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:45.463054 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" event={"ID":"5ee1ba24-cbb2-47f2-87ce-02a750d08102","Type":"ContainerStarted","Data":"80ac39255482975b49c272509d945bc827ab14a7423e452efd1050f6b7c6fbe2"} Apr 17 18:55:47.470092 ip-10-0-132-192 kubenswrapper[2571]: 
I0417 18:55:47.470059 2571 generic.go:358] "Generic (PLEG): container finished" podID="5ee1ba24-cbb2-47f2-87ce-02a750d08102" containerID="d9b718df734b7abb69c12588b65417a0a25cdf2dd50847b23152e02db1323827" exitCode=0 Apr 17 18:55:47.470486 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:47.470148 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" event={"ID":"5ee1ba24-cbb2-47f2-87ce-02a750d08102","Type":"ContainerDied","Data":"d9b718df734b7abb69c12588b65417a0a25cdf2dd50847b23152e02db1323827"} Apr 17 18:55:48.475833 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:48.475797 2571 generic.go:358] "Generic (PLEG): container finished" podID="5ee1ba24-cbb2-47f2-87ce-02a750d08102" containerID="aae26dcbe981d20acaba50584f65787b8759d1602b75c25823aeba7d792f3378" exitCode=0 Apr 17 18:55:48.476207 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:48.475879 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" event={"ID":"5ee1ba24-cbb2-47f2-87ce-02a750d08102","Type":"ContainerDied","Data":"aae26dcbe981d20acaba50584f65787b8759d1602b75c25823aeba7d792f3378"} Apr 17 18:55:49.594189 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:49.594165 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:55:49.693125 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:49.693099 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ee1ba24-cbb2-47f2-87ce-02a750d08102-bundle\") pod \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\" (UID: \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\") " Apr 17 18:55:49.693256 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:49.693134 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsnkh\" (UniqueName: \"kubernetes.io/projected/5ee1ba24-cbb2-47f2-87ce-02a750d08102-kube-api-access-qsnkh\") pod \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\" (UID: \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\") " Apr 17 18:55:49.693256 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:49.693201 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ee1ba24-cbb2-47f2-87ce-02a750d08102-util\") pod \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\" (UID: \"5ee1ba24-cbb2-47f2-87ce-02a750d08102\") " Apr 17 18:55:49.693891 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:49.693860 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ee1ba24-cbb2-47f2-87ce-02a750d08102-bundle" (OuterVolumeSpecName: "bundle") pod "5ee1ba24-cbb2-47f2-87ce-02a750d08102" (UID: "5ee1ba24-cbb2-47f2-87ce-02a750d08102"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:55:49.695223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:49.695201 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee1ba24-cbb2-47f2-87ce-02a750d08102-kube-api-access-qsnkh" (OuterVolumeSpecName: "kube-api-access-qsnkh") pod "5ee1ba24-cbb2-47f2-87ce-02a750d08102" (UID: "5ee1ba24-cbb2-47f2-87ce-02a750d08102"). InnerVolumeSpecName "kube-api-access-qsnkh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:55:49.698453 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:49.698420 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ee1ba24-cbb2-47f2-87ce-02a750d08102-util" (OuterVolumeSpecName: "util") pod "5ee1ba24-cbb2-47f2-87ce-02a750d08102" (UID: "5ee1ba24-cbb2-47f2-87ce-02a750d08102"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:55:49.793666 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:49.793639 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ee1ba24-cbb2-47f2-87ce-02a750d08102-util\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:55:49.793666 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:49.793665 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ee1ba24-cbb2-47f2-87ce-02a750d08102-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:55:49.793804 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:49.793676 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qsnkh\" (UniqueName: \"kubernetes.io/projected/5ee1ba24-cbb2-47f2-87ce-02a750d08102-kube-api-access-qsnkh\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:55:50.483501 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:50.483449 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" event={"ID":"5ee1ba24-cbb2-47f2-87ce-02a750d08102","Type":"ContainerDied","Data":"80ac39255482975b49c272509d945bc827ab14a7423e452efd1050f6b7c6fbe2"} Apr 17 18:55:50.483501 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:50.483501 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80ac39255482975b49c272509d945bc827ab14a7423e452efd1050f6b7c6fbe2" Apr 17 18:55:50.483700 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:55:50.483527 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zjkl4" Apr 17 18:56:00.773023 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.772984 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz"] Apr 17 18:56:00.773489 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.773425 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ee1ba24-cbb2-47f2-87ce-02a750d08102" containerName="pull" Apr 17 18:56:00.773489 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.773443 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee1ba24-cbb2-47f2-87ce-02a750d08102" containerName="pull" Apr 17 18:56:00.773489 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.773453 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ee1ba24-cbb2-47f2-87ce-02a750d08102" containerName="extract" Apr 17 18:56:00.773489 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.773478 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee1ba24-cbb2-47f2-87ce-02a750d08102" containerName="extract" Apr 17 18:56:00.773489 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.773490 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="5ee1ba24-cbb2-47f2-87ce-02a750d08102" containerName="util" Apr 17 18:56:00.773721 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.773498 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee1ba24-cbb2-47f2-87ce-02a750d08102" containerName="util" Apr 17 18:56:00.776706 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.774172 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ee1ba24-cbb2-47f2-87ce-02a750d08102" containerName="extract" Apr 17 18:56:00.781631 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.781586 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:00.783828 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.783807 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 18:56:00.784361 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.784337 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz"] Apr 17 18:56:00.784643 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.784621 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 18:56:00.784765 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.784629 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sx5lk\"" Apr 17 18:56:00.876888 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.876855 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz\" (UID: 
\"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:00.877042 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.876898 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkhsw\" (UniqueName: \"kubernetes.io/projected/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-kube-api-access-lkhsw\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz\" (UID: \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:00.877042 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.876973 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz\" (UID: \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:00.977728 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.977700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz\" (UID: \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:00.977906 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.977742 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhsw\" (UniqueName: \"kubernetes.io/projected/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-kube-api-access-lkhsw\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz\" (UID: 
\"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:00.977906 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.977869 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz\" (UID: \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:00.978137 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.978114 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz\" (UID: \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:00.978203 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.978146 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz\" (UID: \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:00.985524 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:00.985501 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkhsw\" (UniqueName: \"kubernetes.io/projected/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-kube-api-access-lkhsw\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz\" (UID: \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:01.091275 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.091211 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:01.212112 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.212089 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz"] Apr 17 18:56:01.214361 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:56:01.214332 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99b9a6ab_f980_4a3c_96ba_c30c9e0884ef.slice/crio-009669d475f563714865cc9392163696673f77da2f8bbb1342008a623a39d87a WatchSource:0}: Error finding container 009669d475f563714865cc9392163696673f77da2f8bbb1342008a623a39d87a: Status 404 returned error can't find the container with id 009669d475f563714865cc9392163696673f77da2f8bbb1342008a623a39d87a Apr 17 18:56:01.520013 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.519979 2571 generic.go:358] "Generic (PLEG): container finished" podID="99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" containerID="9a31747aebb3c4a5fe57d74b796ad831fd68b6b529cfb6411c568f033f018ee5" exitCode=0 Apr 17 18:56:01.520160 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.520060 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" event={"ID":"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef","Type":"ContainerDied","Data":"9a31747aebb3c4a5fe57d74b796ad831fd68b6b529cfb6411c568f033f018ee5"} Apr 17 18:56:01.520160 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.520101 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" event={"ID":"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef","Type":"ContainerStarted","Data":"009669d475f563714865cc9392163696673f77da2f8bbb1342008a623a39d87a"} Apr 17 18:56:01.538279 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.538249 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp"] Apr 17 18:56:01.541702 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.541682 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" Apr 17 18:56:01.544064 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.544043 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-w5cl6\"" Apr 17 18:56:01.544190 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.544158 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 18:56:01.544424 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.544404 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 18:56:01.544552 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.544535 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 18:56:01.544618 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.544605 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 18:56:01.556727 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.556703 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp"] Apr 17 18:56:01.583741 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.583716 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08b734ec-3c4b-42b6-830b-03d6aa1ece20-webhook-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jbgdp\" (UID: \"08b734ec-3c4b-42b6-830b-03d6aa1ece20\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" Apr 17 18:56:01.583866 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.583809 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4tzs\" (UniqueName: \"kubernetes.io/projected/08b734ec-3c4b-42b6-830b-03d6aa1ece20-kube-api-access-w4tzs\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jbgdp\" (UID: \"08b734ec-3c4b-42b6-830b-03d6aa1ece20\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" Apr 17 18:56:01.583866 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.583851 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08b734ec-3c4b-42b6-830b-03d6aa1ece20-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jbgdp\" (UID: \"08b734ec-3c4b-42b6-830b-03d6aa1ece20\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" Apr 17 18:56:01.684340 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.684301 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08b734ec-3c4b-42b6-830b-03d6aa1ece20-webhook-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jbgdp\" (UID: \"08b734ec-3c4b-42b6-830b-03d6aa1ece20\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" 
Apr 17 18:56:01.684557 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.684367 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4tzs\" (UniqueName: \"kubernetes.io/projected/08b734ec-3c4b-42b6-830b-03d6aa1ece20-kube-api-access-w4tzs\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jbgdp\" (UID: \"08b734ec-3c4b-42b6-830b-03d6aa1ece20\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" Apr 17 18:56:01.684557 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.684396 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08b734ec-3c4b-42b6-830b-03d6aa1ece20-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jbgdp\" (UID: \"08b734ec-3c4b-42b6-830b-03d6aa1ece20\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" Apr 17 18:56:01.686796 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.686770 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08b734ec-3c4b-42b6-830b-03d6aa1ece20-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jbgdp\" (UID: \"08b734ec-3c4b-42b6-830b-03d6aa1ece20\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" Apr 17 18:56:01.686796 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.686782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08b734ec-3c4b-42b6-830b-03d6aa1ece20-webhook-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jbgdp\" (UID: \"08b734ec-3c4b-42b6-830b-03d6aa1ece20\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" Apr 17 18:56:01.691787 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.691768 2571 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-w4tzs\" (UniqueName: \"kubernetes.io/projected/08b734ec-3c4b-42b6-830b-03d6aa1ece20-kube-api-access-w4tzs\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jbgdp\" (UID: \"08b734ec-3c4b-42b6-830b-03d6aa1ece20\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" Apr 17 18:56:01.852297 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.852207 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" Apr 17 18:56:01.974201 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:01.974175 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp"] Apr 17 18:56:01.977176 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:56:01.977147 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08b734ec_3c4b_42b6_830b_03d6aa1ece20.slice/crio-7234a1f224d9acc561e8d402e7869e0535a12985d008257325773ac7887eddf7 WatchSource:0}: Error finding container 7234a1f224d9acc561e8d402e7869e0535a12985d008257325773ac7887eddf7: Status 404 returned error can't find the container with id 7234a1f224d9acc561e8d402e7869e0535a12985d008257325773ac7887eddf7 Apr 17 18:56:02.526681 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:02.526647 2571 generic.go:358] "Generic (PLEG): container finished" podID="99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" containerID="2fd4d3f3a69f47f191fe8ec76cdf2d2bc95e2348b14c603c21c67d37ae998afc" exitCode=0 Apr 17 18:56:02.526862 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:02.526770 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" event={"ID":"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef","Type":"ContainerDied","Data":"2fd4d3f3a69f47f191fe8ec76cdf2d2bc95e2348b14c603c21c67d37ae998afc"} Apr 17 
18:56:02.528660 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:02.528561 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" event={"ID":"08b734ec-3c4b-42b6-830b-03d6aa1ece20","Type":"ContainerStarted","Data":"7234a1f224d9acc561e8d402e7869e0535a12985d008257325773ac7887eddf7"} Apr 17 18:56:03.534515 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:03.534476 2571 generic.go:358] "Generic (PLEG): container finished" podID="99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" containerID="2b4fc8fd1e457ca95baec41875fc0ecfc0cb3514d16d427de584f040fb2123dc" exitCode=0 Apr 17 18:56:03.534972 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:03.534558 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" event={"ID":"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef","Type":"ContainerDied","Data":"2b4fc8fd1e457ca95baec41875fc0ecfc0cb3514d16d427de584f040fb2123dc"} Apr 17 18:56:04.545453 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:04.545410 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" event={"ID":"08b734ec-3c4b-42b6-830b-03d6aa1ece20","Type":"ContainerStarted","Data":"105b6a8842a5d6b60c7e54e785884767b57af291c77837ba286aaec72d18b336"} Apr 17 18:56:04.566283 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:04.566202 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" podStartSLOduration=1.153486085 podStartE2EDuration="3.566184317s" podCreationTimestamp="2026-04-17 18:56:01 +0000 UTC" firstStartedPulling="2026-04-17 18:56:01.978862382 +0000 UTC m=+401.252879653" lastFinishedPulling="2026-04-17 18:56:04.391560614 +0000 UTC m=+403.665577885" observedRunningTime="2026-04-17 18:56:04.563290235 +0000 UTC m=+403.837307528" watchObservedRunningTime="2026-04-17 
18:56:04.566184317 +0000 UTC m=+403.840201611" Apr 17 18:56:04.662108 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:04.662085 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:04.710561 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:04.710537 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-bundle\") pod \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\" (UID: \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\") " Apr 17 18:56:04.710713 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:04.710600 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-util\") pod \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\" (UID: \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\") " Apr 17 18:56:04.710713 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:04.710642 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkhsw\" (UniqueName: \"kubernetes.io/projected/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-kube-api-access-lkhsw\") pod \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\" (UID: \"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef\") " Apr 17 18:56:04.711369 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:04.711326 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-bundle" (OuterVolumeSpecName: "bundle") pod "99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" (UID: "99b9a6ab-f980-4a3c-96ba-c30c9e0884ef"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:56:04.712639 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:04.712617 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-kube-api-access-lkhsw" (OuterVolumeSpecName: "kube-api-access-lkhsw") pod "99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" (UID: "99b9a6ab-f980-4a3c-96ba-c30c9e0884ef"). InnerVolumeSpecName "kube-api-access-lkhsw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:56:04.715872 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:04.715849 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-util" (OuterVolumeSpecName: "util") pod "99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" (UID: "99b9a6ab-f980-4a3c-96ba-c30c9e0884ef"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:56:04.811095 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:04.811065 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-util\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:56:04.811095 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:04.811091 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lkhsw\" (UniqueName: \"kubernetes.io/projected/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-kube-api-access-lkhsw\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:56:04.811236 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:04.811110 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99b9a6ab-f980-4a3c-96ba-c30c9e0884ef-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:56:05.551885 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:05.551858 2571 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" Apr 17 18:56:05.552237 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:05.551859 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xnswz" event={"ID":"99b9a6ab-f980-4a3c-96ba-c30c9e0884ef","Type":"ContainerDied","Data":"009669d475f563714865cc9392163696673f77da2f8bbb1342008a623a39d87a"} Apr 17 18:56:05.552237 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:05.551967 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="009669d475f563714865cc9392163696673f77da2f8bbb1342008a623a39d87a" Apr 17 18:56:05.552237 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:05.552232 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" Apr 17 18:56:16.557549 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:16.557515 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jbgdp" Apr 17 18:56:19.197334 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.197304 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj"] Apr 17 18:56:19.197716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.197628 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" containerName="pull" Apr 17 18:56:19.197716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.197639 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" containerName="pull" Apr 17 18:56:19.197716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.197655 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" containerName="extract" Apr 17 18:56:19.197716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.197660 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" containerName="extract" Apr 17 18:56:19.197716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.197672 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" containerName="util" Apr 17 18:56:19.197716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.197677 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" containerName="util" Apr 17 18:56:19.197899 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.197725 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="99b9a6ab-f980-4a3c-96ba-c30c9e0884ef" containerName="extract" Apr 17 18:56:19.202516 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.202496 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.204864 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.204836 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 18:56:19.205656 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.205641 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 18:56:19.205656 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.205649 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 18:56:19.205804 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.205724 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-n9k5b\"" Apr 17 18:56:19.211886 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.211866 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj"] Apr 17 18:56:19.318983 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.318952 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19dba8f8-4b70-40ff-9034-d52c4e0f0ba5-cert\") pod \"lws-controller-manager-697b5bd5df-x9vvj\" (UID: \"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.319170 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.319012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/19dba8f8-4b70-40ff-9034-d52c4e0f0ba5-metrics-cert\") pod \"lws-controller-manager-697b5bd5df-x9vvj\" (UID: \"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5\") " 
pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.319170 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.319076 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/19dba8f8-4b70-40ff-9034-d52c4e0f0ba5-manager-config\") pod \"lws-controller-manager-697b5bd5df-x9vvj\" (UID: \"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.319170 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.319093 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhdbr\" (UniqueName: \"kubernetes.io/projected/19dba8f8-4b70-40ff-9034-d52c4e0f0ba5-kube-api-access-qhdbr\") pod \"lws-controller-manager-697b5bd5df-x9vvj\" (UID: \"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.420032 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.419995 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19dba8f8-4b70-40ff-9034-d52c4e0f0ba5-cert\") pod \"lws-controller-manager-697b5bd5df-x9vvj\" (UID: \"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.420185 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.420049 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/19dba8f8-4b70-40ff-9034-d52c4e0f0ba5-metrics-cert\") pod \"lws-controller-manager-697b5bd5df-x9vvj\" (UID: \"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.420185 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.420078 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/19dba8f8-4b70-40ff-9034-d52c4e0f0ba5-manager-config\") pod \"lws-controller-manager-697b5bd5df-x9vvj\" (UID: \"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.420185 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.420096 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhdbr\" (UniqueName: \"kubernetes.io/projected/19dba8f8-4b70-40ff-9034-d52c4e0f0ba5-kube-api-access-qhdbr\") pod \"lws-controller-manager-697b5bd5df-x9vvj\" (UID: \"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.420769 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.420746 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/19dba8f8-4b70-40ff-9034-d52c4e0f0ba5-manager-config\") pod \"lws-controller-manager-697b5bd5df-x9vvj\" (UID: \"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.422485 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.422442 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/19dba8f8-4b70-40ff-9034-d52c4e0f0ba5-metrics-cert\") pod \"lws-controller-manager-697b5bd5df-x9vvj\" (UID: \"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.422577 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.422516 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19dba8f8-4b70-40ff-9034-d52c4e0f0ba5-cert\") pod \"lws-controller-manager-697b5bd5df-x9vvj\" (UID: \"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5\") " 
pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.428549 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.428512 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhdbr\" (UniqueName: \"kubernetes.io/projected/19dba8f8-4b70-40ff-9034-d52c4e0f0ba5-kube-api-access-qhdbr\") pod \"lws-controller-manager-697b5bd5df-x9vvj\" (UID: \"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.512073 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.512047 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:19.632778 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:19.632676 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj"] Apr 17 18:56:19.635405 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:56:19.635371 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19dba8f8_4b70_40ff_9034_d52c4e0f0ba5.slice/crio-30b1e76897e081aecdea3dc003180e876cae8cd7289d8834739426d320262dd6 WatchSource:0}: Error finding container 30b1e76897e081aecdea3dc003180e876cae8cd7289d8834739426d320262dd6: Status 404 returned error can't find the container with id 30b1e76897e081aecdea3dc003180e876cae8cd7289d8834739426d320262dd6 Apr 17 18:56:20.604420 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:20.604377 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" event={"ID":"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5","Type":"ContainerStarted","Data":"30b1e76897e081aecdea3dc003180e876cae8cd7289d8834739426d320262dd6"} Apr 17 18:56:21.609711 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.609670 2571 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" event={"ID":"19dba8f8-4b70-40ff-9034-d52c4e0f0ba5","Type":"ContainerStarted","Data":"d9b2680a70743b0c4c142bf1e9be53fa87f6caba899b61945f9accb14dc449f8"} Apr 17 18:56:21.610089 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.609816 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:21.623978 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.623926 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" podStartSLOduration=1.015808297 podStartE2EDuration="2.623912767s" podCreationTimestamp="2026-04-17 18:56:19 +0000 UTC" firstStartedPulling="2026-04-17 18:56:19.637013643 +0000 UTC m=+418.911030913" lastFinishedPulling="2026-04-17 18:56:21.245118111 +0000 UTC m=+420.519135383" observedRunningTime="2026-04-17 18:56:21.623565695 +0000 UTC m=+420.897582987" watchObservedRunningTime="2026-04-17 18:56:21.623912767 +0000 UTC m=+420.897930060" Apr 17 18:56:21.759363 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.759330 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg"] Apr 17 18:56:21.768364 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.768331 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:21.769472 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.769423 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg"] Apr 17 18:56:21.770712 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.770693 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 18:56:21.770835 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.770759 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sx5lk\"" Apr 17 18:56:21.770971 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.770956 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 18:56:21.838973 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.838946 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv824\" (UniqueName: \"kubernetes.io/projected/fb937405-4711-489a-b79a-f82e4c3dedcc-kube-api-access-tv824\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg\" (UID: \"fb937405-4711-489a-b79a-f82e4c3dedcc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:21.839135 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.839004 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb937405-4711-489a-b79a-f82e4c3dedcc-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg\" (UID: \"fb937405-4711-489a-b79a-f82e4c3dedcc\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:21.839135 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.839082 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb937405-4711-489a-b79a-f82e4c3dedcc-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg\" (UID: \"fb937405-4711-489a-b79a-f82e4c3dedcc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:21.940496 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.940413 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tv824\" (UniqueName: \"kubernetes.io/projected/fb937405-4711-489a-b79a-f82e4c3dedcc-kube-api-access-tv824\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg\" (UID: \"fb937405-4711-489a-b79a-f82e4c3dedcc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:21.940496 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.940476 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb937405-4711-489a-b79a-f82e4c3dedcc-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg\" (UID: \"fb937405-4711-489a-b79a-f82e4c3dedcc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:21.940659 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.940502 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb937405-4711-489a-b79a-f82e4c3dedcc-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg\" (UID: \"fb937405-4711-489a-b79a-f82e4c3dedcc\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:21.940858 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.940825 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb937405-4711-489a-b79a-f82e4c3dedcc-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg\" (UID: \"fb937405-4711-489a-b79a-f82e4c3dedcc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:21.940893 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.940870 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb937405-4711-489a-b79a-f82e4c3dedcc-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg\" (UID: \"fb937405-4711-489a-b79a-f82e4c3dedcc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:21.948301 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:21.948278 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv824\" (UniqueName: \"kubernetes.io/projected/fb937405-4711-489a-b79a-f82e4c3dedcc-kube-api-access-tv824\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg\" (UID: \"fb937405-4711-489a-b79a-f82e4c3dedcc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:22.078089 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:22.078054 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:22.191755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:22.191731 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg"] Apr 17 18:56:22.193796 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:56:22.193764 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb937405_4711_489a_b79a_f82e4c3dedcc.slice/crio-a636d463aae69de0416a346fee04498dfd404a44bbfcf7e4720fcd4064748148 WatchSource:0}: Error finding container a636d463aae69de0416a346fee04498dfd404a44bbfcf7e4720fcd4064748148: Status 404 returned error can't find the container with id a636d463aae69de0416a346fee04498dfd404a44bbfcf7e4720fcd4064748148 Apr 17 18:56:22.614094 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:22.614058 2571 generic.go:358] "Generic (PLEG): container finished" podID="fb937405-4711-489a-b79a-f82e4c3dedcc" containerID="27dac327109fbeca11a12d8033721dff917dd96cbd1ac57570d33eee59657747" exitCode=0 Apr 17 18:56:22.614582 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:22.614147 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" event={"ID":"fb937405-4711-489a-b79a-f82e4c3dedcc","Type":"ContainerDied","Data":"27dac327109fbeca11a12d8033721dff917dd96cbd1ac57570d33eee59657747"} Apr 17 18:56:22.614582 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:22.614190 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" event={"ID":"fb937405-4711-489a-b79a-f82e4c3dedcc","Type":"ContainerStarted","Data":"a636d463aae69de0416a346fee04498dfd404a44bbfcf7e4720fcd4064748148"} Apr 17 18:56:24.622128 ip-10-0-132-192 kubenswrapper[2571]: 
I0417 18:56:24.622091 2571 generic.go:358] "Generic (PLEG): container finished" podID="fb937405-4711-489a-b79a-f82e4c3dedcc" containerID="362607a161f147fc6ab9743cc194d319f999a0d73514688f5facee25ba6cf02c" exitCode=0 Apr 17 18:56:24.622484 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:24.622176 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" event={"ID":"fb937405-4711-489a-b79a-f82e4c3dedcc","Type":"ContainerDied","Data":"362607a161f147fc6ab9743cc194d319f999a0d73514688f5facee25ba6cf02c"} Apr 17 18:56:25.630205 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:25.630171 2571 generic.go:358] "Generic (PLEG): container finished" podID="fb937405-4711-489a-b79a-f82e4c3dedcc" containerID="1bc4c9b663ffa3fc6c71de10e873dff2d06353adb317d63cb32013b082dee760" exitCode=0 Apr 17 18:56:25.630668 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:25.630265 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" event={"ID":"fb937405-4711-489a-b79a-f82e4c3dedcc","Type":"ContainerDied","Data":"1bc4c9b663ffa3fc6c71de10e873dff2d06353adb317d63cb32013b082dee760"} Apr 17 18:56:26.751876 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:26.751854 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:26.883946 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:26.883913 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb937405-4711-489a-b79a-f82e4c3dedcc-bundle\") pod \"fb937405-4711-489a-b79a-f82e4c3dedcc\" (UID: \"fb937405-4711-489a-b79a-f82e4c3dedcc\") " Apr 17 18:56:26.884131 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:26.883963 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb937405-4711-489a-b79a-f82e4c3dedcc-util\") pod \"fb937405-4711-489a-b79a-f82e4c3dedcc\" (UID: \"fb937405-4711-489a-b79a-f82e4c3dedcc\") " Apr 17 18:56:26.884131 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:26.883996 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv824\" (UniqueName: \"kubernetes.io/projected/fb937405-4711-489a-b79a-f82e4c3dedcc-kube-api-access-tv824\") pod \"fb937405-4711-489a-b79a-f82e4c3dedcc\" (UID: \"fb937405-4711-489a-b79a-f82e4c3dedcc\") " Apr 17 18:56:26.884907 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:26.884872 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb937405-4711-489a-b79a-f82e4c3dedcc-bundle" (OuterVolumeSpecName: "bundle") pod "fb937405-4711-489a-b79a-f82e4c3dedcc" (UID: "fb937405-4711-489a-b79a-f82e4c3dedcc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:56:26.886091 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:26.886065 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb937405-4711-489a-b79a-f82e4c3dedcc-kube-api-access-tv824" (OuterVolumeSpecName: "kube-api-access-tv824") pod "fb937405-4711-489a-b79a-f82e4c3dedcc" (UID: "fb937405-4711-489a-b79a-f82e4c3dedcc"). InnerVolumeSpecName "kube-api-access-tv824". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:56:26.889423 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:26.889382 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb937405-4711-489a-b79a-f82e4c3dedcc-util" (OuterVolumeSpecName: "util") pod "fb937405-4711-489a-b79a-f82e4c3dedcc" (UID: "fb937405-4711-489a-b79a-f82e4c3dedcc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:56:26.984710 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:26.984683 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb937405-4711-489a-b79a-f82e4c3dedcc-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:56:26.984710 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:26.984707 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb937405-4711-489a-b79a-f82e4c3dedcc-util\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:56:26.984855 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:26.984717 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tv824\" (UniqueName: \"kubernetes.io/projected/fb937405-4711-489a-b79a-f82e4c3dedcc-kube-api-access-tv824\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:56:27.639229 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:27.639191 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" event={"ID":"fb937405-4711-489a-b79a-f82e4c3dedcc","Type":"ContainerDied","Data":"a636d463aae69de0416a346fee04498dfd404a44bbfcf7e4720fcd4064748148"} Apr 17 18:56:27.639229 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:27.639232 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a636d463aae69de0416a346fee04498dfd404a44bbfcf7e4720fcd4064748148" Apr 17 18:56:27.639430 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:27.639204 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835lvswg" Apr 17 18:56:32.616415 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:32.616386 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-x9vvj" Apr 17 18:56:36.375326 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.375293 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z"] Apr 17 18:56:36.375831 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.375756 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb937405-4711-489a-b79a-f82e4c3dedcc" containerName="pull" Apr 17 18:56:36.375831 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.375775 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb937405-4711-489a-b79a-f82e4c3dedcc" containerName="pull" Apr 17 18:56:36.375831 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.375788 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb937405-4711-489a-b79a-f82e4c3dedcc" containerName="util" Apr 17 18:56:36.375831 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.375795 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fb937405-4711-489a-b79a-f82e4c3dedcc" containerName="util" Apr 17 18:56:36.375831 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.375807 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb937405-4711-489a-b79a-f82e4c3dedcc" containerName="extract" Apr 17 18:56:36.375831 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.375814 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb937405-4711-489a-b79a-f82e4c3dedcc" containerName="extract" Apr 17 18:56:36.376167 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.375888 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb937405-4711-489a-b79a-f82e4c3dedcc" containerName="extract" Apr 17 18:56:36.383034 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.383013 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:56:36.385242 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.385214 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 18:56:36.385242 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.385220 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 18:56:36.385993 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.385974 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sx5lk\"" Apr 17 18:56:36.391752 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.391731 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z"] Apr 17 18:56:36.462529 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.462498 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b68773de-45ca-46de-99b5-1c7c5f32628e-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z\" (UID: \"b68773de-45ca-46de-99b5-1c7c5f32628e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:56:36.462529 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.462537 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqqsl\" (UniqueName: \"kubernetes.io/projected/b68773de-45ca-46de-99b5-1c7c5f32628e-kube-api-access-fqqsl\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z\" (UID: \"b68773de-45ca-46de-99b5-1c7c5f32628e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:56:36.462735 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.462555 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b68773de-45ca-46de-99b5-1c7c5f32628e-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z\" (UID: \"b68773de-45ca-46de-99b5-1c7c5f32628e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:56:36.563119 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.563088 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b68773de-45ca-46de-99b5-1c7c5f32628e-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z\" (UID: \"b68773de-45ca-46de-99b5-1c7c5f32628e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:56:36.563308 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.563162 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b68773de-45ca-46de-99b5-1c7c5f32628e-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z\" (UID: \"b68773de-45ca-46de-99b5-1c7c5f32628e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:56:36.563308 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.563184 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqqsl\" (UniqueName: \"kubernetes.io/projected/b68773de-45ca-46de-99b5-1c7c5f32628e-kube-api-access-fqqsl\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z\" (UID: \"b68773de-45ca-46de-99b5-1c7c5f32628e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:56:36.563557 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.563537 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b68773de-45ca-46de-99b5-1c7c5f32628e-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z\" (UID: \"b68773de-45ca-46de-99b5-1c7c5f32628e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:56:36.563634 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.563555 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b68773de-45ca-46de-99b5-1c7c5f32628e-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z\" (UID: \"b68773de-45ca-46de-99b5-1c7c5f32628e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:56:36.575196 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.575168 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqqsl\" (UniqueName: 
\"kubernetes.io/projected/b68773de-45ca-46de-99b5-1c7c5f32628e-kube-api-access-fqqsl\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z\" (UID: \"b68773de-45ca-46de-99b5-1c7c5f32628e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:56:36.693666 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.693589 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:56:36.812978 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:36.812948 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z"] Apr 17 18:56:36.815497 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:56:36.815444 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb68773de_45ca_46de_99b5_1c7c5f32628e.slice/crio-cbcd3abf41c861426e150f01451258af5e2b6f7f7696ccb753b29d0ee8fbbad2 WatchSource:0}: Error finding container cbcd3abf41c861426e150f01451258af5e2b6f7f7696ccb753b29d0ee8fbbad2: Status 404 returned error can't find the container with id cbcd3abf41c861426e150f01451258af5e2b6f7f7696ccb753b29d0ee8fbbad2 Apr 17 18:56:37.674952 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:37.674919 2571 generic.go:358] "Generic (PLEG): container finished" podID="b68773de-45ca-46de-99b5-1c7c5f32628e" containerID="07a11875c92633dddac0939adcfced5ace9c5c5fa6c07d9f11862ea2955125df" exitCode=0 Apr 17 18:56:37.675314 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:37.674967 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" event={"ID":"b68773de-45ca-46de-99b5-1c7c5f32628e","Type":"ContainerDied","Data":"07a11875c92633dddac0939adcfced5ace9c5c5fa6c07d9f11862ea2955125df"} Apr 17 
18:56:37.675314 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:37.674989 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" event={"ID":"b68773de-45ca-46de-99b5-1c7c5f32628e","Type":"ContainerStarted","Data":"cbcd3abf41c861426e150f01451258af5e2b6f7f7696ccb753b29d0ee8fbbad2"} Apr 17 18:56:39.683570 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:39.683534 2571 generic.go:358] "Generic (PLEG): container finished" podID="b68773de-45ca-46de-99b5-1c7c5f32628e" containerID="de6e378ae8fb6b918daeded9fccbe39dc8019230d22e15da46ca256e26929cb2" exitCode=0 Apr 17 18:56:39.683949 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:39.683624 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" event={"ID":"b68773de-45ca-46de-99b5-1c7c5f32628e","Type":"ContainerDied","Data":"de6e378ae8fb6b918daeded9fccbe39dc8019230d22e15da46ca256e26929cb2"} Apr 17 18:56:40.688140 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:40.688102 2571 generic.go:358] "Generic (PLEG): container finished" podID="b68773de-45ca-46de-99b5-1c7c5f32628e" containerID="861d70e4214ad0ec2f6e8337b5b78ea6a8a2370848f72af661e6a45717c49cfb" exitCode=0 Apr 17 18:56:40.688541 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:40.688225 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" event={"ID":"b68773de-45ca-46de-99b5-1c7c5f32628e","Type":"ContainerDied","Data":"861d70e4214ad0ec2f6e8337b5b78ea6a8a2370848f72af661e6a45717c49cfb"} Apr 17 18:56:41.807035 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:41.807010 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:56:41.908072 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:41.908029 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b68773de-45ca-46de-99b5-1c7c5f32628e-util\") pod \"b68773de-45ca-46de-99b5-1c7c5f32628e\" (UID: \"b68773de-45ca-46de-99b5-1c7c5f32628e\") " Apr 17 18:56:41.908262 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:41.908109 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqqsl\" (UniqueName: \"kubernetes.io/projected/b68773de-45ca-46de-99b5-1c7c5f32628e-kube-api-access-fqqsl\") pod \"b68773de-45ca-46de-99b5-1c7c5f32628e\" (UID: \"b68773de-45ca-46de-99b5-1c7c5f32628e\") " Apr 17 18:56:41.908262 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:41.908135 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b68773de-45ca-46de-99b5-1c7c5f32628e-bundle\") pod \"b68773de-45ca-46de-99b5-1c7c5f32628e\" (UID: \"b68773de-45ca-46de-99b5-1c7c5f32628e\") " Apr 17 18:56:41.909043 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:41.909011 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b68773de-45ca-46de-99b5-1c7c5f32628e-bundle" (OuterVolumeSpecName: "bundle") pod "b68773de-45ca-46de-99b5-1c7c5f32628e" (UID: "b68773de-45ca-46de-99b5-1c7c5f32628e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:56:41.910168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:41.910131 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b68773de-45ca-46de-99b5-1c7c5f32628e-kube-api-access-fqqsl" (OuterVolumeSpecName: "kube-api-access-fqqsl") pod "b68773de-45ca-46de-99b5-1c7c5f32628e" (UID: "b68773de-45ca-46de-99b5-1c7c5f32628e"). InnerVolumeSpecName "kube-api-access-fqqsl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:56:41.913769 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:41.913732 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b68773de-45ca-46de-99b5-1c7c5f32628e-util" (OuterVolumeSpecName: "util") pod "b68773de-45ca-46de-99b5-1c7c5f32628e" (UID: "b68773de-45ca-46de-99b5-1c7c5f32628e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:56:42.009150 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:42.009125 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b68773de-45ca-46de-99b5-1c7c5f32628e-util\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:56:42.009150 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:42.009151 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fqqsl\" (UniqueName: \"kubernetes.io/projected/b68773de-45ca-46de-99b5-1c7c5f32628e-kube-api-access-fqqsl\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:56:42.009316 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:42.009161 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b68773de-45ca-46de-99b5-1c7c5f32628e-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:56:42.697370 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:42.697336 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" event={"ID":"b68773de-45ca-46de-99b5-1c7c5f32628e","Type":"ContainerDied","Data":"cbcd3abf41c861426e150f01451258af5e2b6f7f7696ccb753b29d0ee8fbbad2"} Apr 17 18:56:42.697370 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:42.697369 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbcd3abf41c861426e150f01451258af5e2b6f7f7696ccb753b29d0ee8fbbad2" Apr 17 18:56:42.697596 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:56:42.697387 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2s6j2z" Apr 17 18:57:12.583595 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.583559 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd"] Apr 17 18:57:12.584138 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.583881 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b68773de-45ca-46de-99b5-1c7c5f32628e" containerName="extract" Apr 17 18:57:12.584138 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.583892 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68773de-45ca-46de-99b5-1c7c5f32628e" containerName="extract" Apr 17 18:57:12.584138 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.583904 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b68773de-45ca-46de-99b5-1c7c5f32628e" containerName="util" Apr 17 18:57:12.584138 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.583909 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68773de-45ca-46de-99b5-1c7c5f32628e" containerName="util" Apr 17 18:57:12.584138 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.583916 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b68773de-45ca-46de-99b5-1c7c5f32628e" containerName="pull" Apr 17 18:57:12.584138 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.583924 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68773de-45ca-46de-99b5-1c7c5f32628e" containerName="pull" Apr 17 18:57:12.584138 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.583991 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b68773de-45ca-46de-99b5-1c7c5f32628e" containerName="extract" Apr 17 18:57:12.587902 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.587882 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.590059 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.590034 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 18:57:12.590349 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.590072 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-hg7sg\"" Apr 17 18:57:12.590583 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.590073 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 18:57:12.590674 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.590160 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 18:57:12.597852 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.597831 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd"] Apr 17 18:57:12.641663 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.641636 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.641801 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.641686 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6nlc\" (UniqueName: \"kubernetes.io/projected/bb4dbe95-70be-41e5-961f-23695c4912f3-kube-api-access-d6nlc\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.641801 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.641732 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.641801 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.641768 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.641936 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.641836 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bb4dbe95-70be-41e5-961f-23695c4912f3-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.641936 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.641868 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.641936 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.641909 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bb4dbe95-70be-41e5-961f-23695c4912f3-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.642053 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.641938 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bb4dbe95-70be-41e5-961f-23695c4912f3-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.642053 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.641955 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.743262 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.743233 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.743397 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.743263 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.743397 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.743288 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bb4dbe95-70be-41e5-961f-23695c4912f3-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.743537 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.743514 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.743597 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.743573 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bb4dbe95-70be-41e5-961f-23695c4912f3-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.743722 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.743693 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.743830 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.743735 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bb4dbe95-70be-41e5-961f-23695c4912f3-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.743830 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.743742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" 
(UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.743830 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.743770 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.743984 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.743846 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.743984 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.743894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6nlc\" (UniqueName: \"kubernetes.io/projected/bb4dbe95-70be-41e5-961f-23695c4912f3-kube-api-access-d6nlc\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.743984 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.743917 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.744215 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.744191 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.744310 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.744292 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bb4dbe95-70be-41e5-961f-23695c4912f3-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.746205 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.746187 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bb4dbe95-70be-41e5-961f-23695c4912f3-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.746281 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.746195 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bb4dbe95-70be-41e5-961f-23695c4912f3-istio-envoy\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.750969 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.750941 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bb4dbe95-70be-41e5-961f-23695c4912f3-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.751061 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.751047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6nlc\" (UniqueName: \"kubernetes.io/projected/bb4dbe95-70be-41e5-961f-23695c4912f3-kube-api-access-d6nlc\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cflfthd\" (UID: \"bb4dbe95-70be-41e5-961f-23695c4912f3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:12.900440 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:12.900367 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:13.017019 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:13.016996 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd"] Apr 17 18:57:13.019182 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:57:13.019151 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb4dbe95_70be_41e5_961f_23695c4912f3.slice/crio-466a752a2268755371067b276062127177c7c9fa39737db31710fe688ef0c61b WatchSource:0}: Error finding container 466a752a2268755371067b276062127177c7c9fa39737db31710fe688ef0c61b: Status 404 returned error can't find the container with id 466a752a2268755371067b276062127177c7c9fa39737db31710fe688ef0c61b Apr 17 18:57:13.805688 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:13.805650 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" event={"ID":"bb4dbe95-70be-41e5-961f-23695c4912f3","Type":"ContainerStarted","Data":"466a752a2268755371067b276062127177c7c9fa39737db31710fe688ef0c61b"} Apr 17 18:57:15.581374 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:15.581336 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 17 18:57:15.581642 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:15.581416 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 17 18:57:15.581642 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:15.581444 2571 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 17 18:57:15.817562 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:15.817526 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" event={"ID":"bb4dbe95-70be-41e5-961f-23695c4912f3","Type":"ContainerStarted","Data":"c7b2065e167838582d0d26900c6e2250e3e5995f815e8501a6a87b3cd6956bfa"} Apr 17 18:57:15.836485 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:15.836377 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" podStartSLOduration=1.2762137789999999 podStartE2EDuration="3.836362972s" podCreationTimestamp="2026-04-17 18:57:12 +0000 UTC" firstStartedPulling="2026-04-17 18:57:13.020950062 +0000 UTC m=+472.294967332" lastFinishedPulling="2026-04-17 18:57:15.581099253 +0000 UTC m=+474.855116525" observedRunningTime="2026-04-17 18:57:15.834789605 +0000 UTC m=+475.108806899" watchObservedRunningTime="2026-04-17 18:57:15.836362972 +0000 UTC m=+475.110380265" Apr 17 18:57:15.901032 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:15.901005 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:16.905514 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:16.905484 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:17.824171 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:17.824132 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:17.825201 ip-10-0-132-192 
kubenswrapper[2571]: I0417 18:57:17.825179 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cflfthd" Apr 17 18:57:20.093551 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.093520 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-rdm6z"] Apr 17 18:57:20.097094 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.097068 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" Apr 17 18:57:20.099380 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.099357 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 18:57:20.099575 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.099475 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-t5w9l\"" Apr 17 18:57:20.099575 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.099475 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 18:57:20.106211 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.106187 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-rdm6z"] Apr 17 18:57:20.203315 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.203282 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljp7x\" (UniqueName: \"kubernetes.io/projected/13d3b34a-cdbb-4352-946e-f8e269ace67f-kube-api-access-ljp7x\") pod \"kuadrant-operator-catalog-rdm6z\" (UID: \"13d3b34a-cdbb-4352-946e-f8e269ace67f\") " pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" Apr 17 18:57:20.304142 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.304113 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljp7x\" (UniqueName: \"kubernetes.io/projected/13d3b34a-cdbb-4352-946e-f8e269ace67f-kube-api-access-ljp7x\") pod \"kuadrant-operator-catalog-rdm6z\" (UID: \"13d3b34a-cdbb-4352-946e-f8e269ace67f\") " pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" Apr 17 18:57:20.311987 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.311962 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljp7x\" (UniqueName: \"kubernetes.io/projected/13d3b34a-cdbb-4352-946e-f8e269ace67f-kube-api-access-ljp7x\") pod \"kuadrant-operator-catalog-rdm6z\" (UID: \"13d3b34a-cdbb-4352-946e-f8e269ace67f\") " pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" Apr 17 18:57:20.409597 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.409509 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" Apr 17 18:57:20.457195 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.457155 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-rdm6z"] Apr 17 18:57:20.526224 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.526148 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-rdm6z"] Apr 17 18:57:20.528371 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:57:20.528334 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13d3b34a_cdbb_4352_946e_f8e269ace67f.slice/crio-930824bf8f79dca084756160a5c172da35ae427589963b413acd291931a8057a WatchSource:0}: Error finding container 930824bf8f79dca084756160a5c172da35ae427589963b413acd291931a8057a: Status 404 returned error can't find the container with id 930824bf8f79dca084756160a5c172da35ae427589963b413acd291931a8057a Apr 17 18:57:20.664050 ip-10-0-132-192 kubenswrapper[2571]: I0417 
18:57:20.663973 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-66ln6"] Apr 17 18:57:20.668352 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.668335 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-66ln6" Apr 17 18:57:20.672515 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.672493 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-66ln6"] Apr 17 18:57:20.706197 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.706174 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcmjl\" (UniqueName: \"kubernetes.io/projected/b2a9405a-505c-4d1a-8245-439bd2294533-kube-api-access-lcmjl\") pod \"kuadrant-operator-catalog-66ln6\" (UID: \"b2a9405a-505c-4d1a-8245-439bd2294533\") " pod="kuadrant-system/kuadrant-operator-catalog-66ln6" Apr 17 18:57:20.807505 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.807479 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcmjl\" (UniqueName: \"kubernetes.io/projected/b2a9405a-505c-4d1a-8245-439bd2294533-kube-api-access-lcmjl\") pod \"kuadrant-operator-catalog-66ln6\" (UID: \"b2a9405a-505c-4d1a-8245-439bd2294533\") " pod="kuadrant-system/kuadrant-operator-catalog-66ln6" Apr 17 18:57:20.815345 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.815322 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcmjl\" (UniqueName: \"kubernetes.io/projected/b2a9405a-505c-4d1a-8245-439bd2294533-kube-api-access-lcmjl\") pod \"kuadrant-operator-catalog-66ln6\" (UID: \"b2a9405a-505c-4d1a-8245-439bd2294533\") " pod="kuadrant-system/kuadrant-operator-catalog-66ln6" Apr 17 18:57:20.835128 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.835101 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" event={"ID":"13d3b34a-cdbb-4352-946e-f8e269ace67f","Type":"ContainerStarted","Data":"930824bf8f79dca084756160a5c172da35ae427589963b413acd291931a8057a"} Apr 17 18:57:20.978889 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:20.978806 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-66ln6" Apr 17 18:57:21.095247 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:21.095220 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-66ln6"] Apr 17 18:57:21.097976 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:57:21.097946 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a9405a_505c_4d1a_8245_439bd2294533.slice/crio-c69b74fcc150a90464315bc2f4eb552e2769505adb048949dacb6f86b5204be9 WatchSource:0}: Error finding container c69b74fcc150a90464315bc2f4eb552e2769505adb048949dacb6f86b5204be9: Status 404 returned error can't find the container with id c69b74fcc150a90464315bc2f4eb552e2769505adb048949dacb6f86b5204be9 Apr 17 18:57:21.839850 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:21.839817 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-66ln6" event={"ID":"b2a9405a-505c-4d1a-8245-439bd2294533","Type":"ContainerStarted","Data":"c69b74fcc150a90464315bc2f4eb552e2769505adb048949dacb6f86b5204be9"} Apr 17 18:57:23.847441 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:23.847398 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" event={"ID":"13d3b34a-cdbb-4352-946e-f8e269ace67f","Type":"ContainerStarted","Data":"d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4"} Apr 17 18:57:23.847903 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:23.847528 2571 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" podUID="13d3b34a-cdbb-4352-946e-f8e269ace67f" containerName="registry-server" containerID="cri-o://d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4" gracePeriod=2 Apr 17 18:57:23.848839 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:23.848806 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-66ln6" event={"ID":"b2a9405a-505c-4d1a-8245-439bd2294533","Type":"ContainerStarted","Data":"4aee454592bc7a85f87b6c70dfb171be331fcf5e6a5e26203766d6a2ca319dd3"} Apr 17 18:57:23.862548 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:23.862499 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" podStartSLOduration=1.5254309190000002 podStartE2EDuration="3.862485574s" podCreationTimestamp="2026-04-17 18:57:20 +0000 UTC" firstStartedPulling="2026-04-17 18:57:20.529699697 +0000 UTC m=+479.803716968" lastFinishedPulling="2026-04-17 18:57:22.866754338 +0000 UTC m=+482.140771623" observedRunningTime="2026-04-17 18:57:23.860132524 +0000 UTC m=+483.134149817" watchObservedRunningTime="2026-04-17 18:57:23.862485574 +0000 UTC m=+483.136502866" Apr 17 18:57:23.872806 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:23.872763 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-66ln6" podStartSLOduration=2.109314809 podStartE2EDuration="3.872752021s" podCreationTimestamp="2026-04-17 18:57:20 +0000 UTC" firstStartedPulling="2026-04-17 18:57:21.099375218 +0000 UTC m=+480.373392489" lastFinishedPulling="2026-04-17 18:57:22.862812431 +0000 UTC m=+482.136829701" observedRunningTime="2026-04-17 18:57:23.872296978 +0000 UTC m=+483.146314274" watchObservedRunningTime="2026-04-17 18:57:23.872752021 +0000 UTC m=+483.146769314" Apr 17 18:57:24.098401 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.098348 2571 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" Apr 17 18:57:24.137933 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.137902 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljp7x\" (UniqueName: \"kubernetes.io/projected/13d3b34a-cdbb-4352-946e-f8e269ace67f-kube-api-access-ljp7x\") pod \"13d3b34a-cdbb-4352-946e-f8e269ace67f\" (UID: \"13d3b34a-cdbb-4352-946e-f8e269ace67f\") " Apr 17 18:57:24.139982 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.139961 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13d3b34a-cdbb-4352-946e-f8e269ace67f-kube-api-access-ljp7x" (OuterVolumeSpecName: "kube-api-access-ljp7x") pod "13d3b34a-cdbb-4352-946e-f8e269ace67f" (UID: "13d3b34a-cdbb-4352-946e-f8e269ace67f"). InnerVolumeSpecName "kube-api-access-ljp7x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:57:24.238783 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.238760 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ljp7x\" (UniqueName: \"kubernetes.io/projected/13d3b34a-cdbb-4352-946e-f8e269ace67f-kube-api-access-ljp7x\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:24.852903 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.852871 2571 generic.go:358] "Generic (PLEG): container finished" podID="13d3b34a-cdbb-4352-946e-f8e269ace67f" containerID="d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4" exitCode=0 Apr 17 18:57:24.853398 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.852938 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" Apr 17 18:57:24.853398 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.852950 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" event={"ID":"13d3b34a-cdbb-4352-946e-f8e269ace67f","Type":"ContainerDied","Data":"d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4"} Apr 17 18:57:24.853398 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.852992 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-rdm6z" event={"ID":"13d3b34a-cdbb-4352-946e-f8e269ace67f","Type":"ContainerDied","Data":"930824bf8f79dca084756160a5c172da35ae427589963b413acd291931a8057a"} Apr 17 18:57:24.853398 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.853007 2571 scope.go:117] "RemoveContainer" containerID="d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4" Apr 17 18:57:24.862326 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.862309 2571 scope.go:117] "RemoveContainer" containerID="d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4" Apr 17 18:57:24.862586 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:57:24.862568 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4\": container with ID starting with d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4 not found: ID does not exist" containerID="d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4" Apr 17 18:57:24.862641 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.862597 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4"} err="failed to get container status \"d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4\": rpc 
error: code = NotFound desc = could not find container \"d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4\": container with ID starting with d371f22ec3d0735bbe26dfb42851bd188fddae568e09a0d6f8a5cb6a2a62c9e4 not found: ID does not exist" Apr 17 18:57:24.873100 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.873078 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-rdm6z"] Apr 17 18:57:24.877221 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:24.877191 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-rdm6z"] Apr 17 18:57:25.237622 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:25.237544 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13d3b34a-cdbb-4352-946e-f8e269ace67f" path="/var/lib/kubelet/pods/13d3b34a-cdbb-4352-946e-f8e269ace67f/volumes" Apr 17 18:57:30.979351 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:30.979312 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-66ln6" Apr 17 18:57:30.979351 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:30.979356 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-66ln6" Apr 17 18:57:31.000216 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:31.000189 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-66ln6" Apr 17 18:57:31.897255 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:31.897229 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-66ln6" Apr 17 18:57:38.996129 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:38.996097 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2"] Apr 17 18:57:38.996528 
ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:38.996442 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13d3b34a-cdbb-4352-946e-f8e269ace67f" containerName="registry-server" Apr 17 18:57:38.996528 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:38.996467 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d3b34a-cdbb-4352-946e-f8e269ace67f" containerName="registry-server" Apr 17 18:57:38.996605 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:38.996557 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="13d3b34a-cdbb-4352-946e-f8e269ace67f" containerName="registry-server" Apr 17 18:57:38.999588 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:38.999573 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:39.001837 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.001815 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-zjsrt\"" Apr 17 18:57:39.007069 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.007049 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2"] Apr 17 18:57:39.053929 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.053903 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr957\" (UniqueName: \"kubernetes.io/projected/b1b8254e-6889-4a58-99ba-3dfe78089325-kube-api-access-hr957\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2\" (UID: \"b1b8254e-6889-4a58-99ba-3dfe78089325\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:39.054057 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.053948 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1b8254e-6889-4a58-99ba-3dfe78089325-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2\" (UID: \"b1b8254e-6889-4a58-99ba-3dfe78089325\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:39.054057 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.053972 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1b8254e-6889-4a58-99ba-3dfe78089325-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2\" (UID: \"b1b8254e-6889-4a58-99ba-3dfe78089325\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:39.097262 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.097235 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7"] Apr 17 18:57:39.100595 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.100580 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:39.107892 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.107816 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7"] Apr 17 18:57:39.155157 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.155075 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hr957\" (UniqueName: \"kubernetes.io/projected/b1b8254e-6889-4a58-99ba-3dfe78089325-kube-api-access-hr957\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2\" (UID: \"b1b8254e-6889-4a58-99ba-3dfe78089325\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:39.155555 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.155535 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1b8254e-6889-4a58-99ba-3dfe78089325-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2\" (UID: \"b1b8254e-6889-4a58-99ba-3dfe78089325\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:39.155681 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.155592 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1b8254e-6889-4a58-99ba-3dfe78089325-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2\" (UID: \"b1b8254e-6889-4a58-99ba-3dfe78089325\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:39.155681 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.155626 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7\" (UID: \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:39.155681 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.155665 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7\" (UID: \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:39.155830 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.155700 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q79vp\" (UniqueName: \"kubernetes.io/projected/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-kube-api-access-q79vp\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7\" (UID: \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:39.156159 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.156135 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1b8254e-6889-4a58-99ba-3dfe78089325-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2\" (UID: \"b1b8254e-6889-4a58-99ba-3dfe78089325\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:39.156528 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.156506 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1b8254e-6889-4a58-99ba-3dfe78089325-bundle\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2\" (UID: \"b1b8254e-6889-4a58-99ba-3dfe78089325\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:39.162604 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.162584 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr957\" (UniqueName: \"kubernetes.io/projected/b1b8254e-6889-4a58-99ba-3dfe78089325-kube-api-access-hr957\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2\" (UID: \"b1b8254e-6889-4a58-99ba-3dfe78089325\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:39.201917 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.201889 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr"] Apr 17 18:57:39.205338 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.205323 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:39.210903 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.210883 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr"] Apr 17 18:57:39.256127 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.256104 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7\" (UID: \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:39.256250 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.256133 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7\" (UID: \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:39.256250 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.256155 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q79vp\" (UniqueName: \"kubernetes.io/projected/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-kube-api-access-q79vp\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7\" (UID: \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:39.256442 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.256424 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7\" (UID: \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:39.256510 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.256496 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7\" (UID: \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:39.263716 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.263694 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q79vp\" (UniqueName: \"kubernetes.io/projected/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-kube-api-access-q79vp\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7\" (UID: \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:39.303014 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.302987 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8"] Apr 17 18:57:39.306600 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.306578 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:39.309224 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.309200 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:39.312533 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.312514 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8"] Apr 17 18:57:39.357602 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.357573 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr\" (UID: \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:39.357712 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.357648 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr\" (UID: \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:39.357766 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.357735 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvwmg\" (UniqueName: \"kubernetes.io/projected/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-kube-api-access-mvwmg\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr\" (UID: \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:39.410912 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.410881 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:39.427786 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.427757 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2"] Apr 17 18:57:39.429078 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:57:39.429051 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1b8254e_6889_4a58_99ba_3dfe78089325.slice/crio-2669b65bb25b7de9be45a64b9d018edc8d2cae45733ea9e3695990d6fb6f6c4c WatchSource:0}: Error finding container 2669b65bb25b7de9be45a64b9d018edc8d2cae45733ea9e3695990d6fb6f6c4c: Status 404 returned error can't find the container with id 2669b65bb25b7de9be45a64b9d018edc8d2cae45733ea9e3695990d6fb6f6c4c Apr 17 18:57:39.459048 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.459017 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr\" (UID: \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:39.459164 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.459068 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5768680b-c657-4605-86d2-05896de9cea7-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8\" (UID: \"5768680b-c657-4605-86d2-05896de9cea7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:39.459164 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.459125 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsv2k\" (UniqueName: \"kubernetes.io/projected/5768680b-c657-4605-86d2-05896de9cea7-kube-api-access-bsv2k\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8\" (UID: \"5768680b-c657-4605-86d2-05896de9cea7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:39.459284 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.459186 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvwmg\" (UniqueName: \"kubernetes.io/projected/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-kube-api-access-mvwmg\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr\" (UID: \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:39.459284 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.459266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr\" (UID: \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:39.459397 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.459320 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5768680b-c657-4605-86d2-05896de9cea7-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8\" (UID: \"5768680b-c657-4605-86d2-05896de9cea7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:39.459529 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.459506 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr\" (UID: \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:39.459695 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.459674 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr\" (UID: \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:39.467276 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.467247 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvwmg\" (UniqueName: \"kubernetes.io/projected/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-kube-api-access-mvwmg\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr\" (UID: \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:39.515757 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.515679 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:39.532447 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.532422 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7"] Apr 17 18:57:39.534014 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:57:39.533983 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee74b3d9_cab8_4e88_9052_dd2d7f28e1df.slice/crio-1808cf0114faa56e47da0d66e8a91d31a0b45f6e926463be3565acb98ba0d355 WatchSource:0}: Error finding container 1808cf0114faa56e47da0d66e8a91d31a0b45f6e926463be3565acb98ba0d355: Status 404 returned error can't find the container with id 1808cf0114faa56e47da0d66e8a91d31a0b45f6e926463be3565acb98ba0d355 Apr 17 18:57:39.560402 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.560375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsv2k\" (UniqueName: \"kubernetes.io/projected/5768680b-c657-4605-86d2-05896de9cea7-kube-api-access-bsv2k\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8\" (UID: \"5768680b-c657-4605-86d2-05896de9cea7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:39.560512 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.560495 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5768680b-c657-4605-86d2-05896de9cea7-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8\" (UID: \"5768680b-c657-4605-86d2-05896de9cea7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:39.560562 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.560534 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5768680b-c657-4605-86d2-05896de9cea7-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8\" (UID: \"5768680b-c657-4605-86d2-05896de9cea7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:39.560907 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.560887 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5768680b-c657-4605-86d2-05896de9cea7-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8\" (UID: \"5768680b-c657-4605-86d2-05896de9cea7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:39.561007 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.560985 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5768680b-c657-4605-86d2-05896de9cea7-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8\" (UID: \"5768680b-c657-4605-86d2-05896de9cea7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:39.569876 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.569847 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsv2k\" (UniqueName: \"kubernetes.io/projected/5768680b-c657-4605-86d2-05896de9cea7-kube-api-access-bsv2k\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8\" (UID: \"5768680b-c657-4605-86d2-05896de9cea7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:39.618290 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.618260 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:39.642508 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.642484 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr"] Apr 17 18:57:39.644472 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:57:39.644431 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd9f2e3_9354_4ec3_86bc_ddcc80c8cd9d.slice/crio-e098e6c47bf0224cbc7798c23ed8a6c115b6d2dad40a480e710febfe52aa8090 WatchSource:0}: Error finding container e098e6c47bf0224cbc7798c23ed8a6c115b6d2dad40a480e710febfe52aa8090: Status 404 returned error can't find the container with id e098e6c47bf0224cbc7798c23ed8a6c115b6d2dad40a480e710febfe52aa8090 Apr 17 18:57:39.753090 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.753066 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8"] Apr 17 18:57:39.754899 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:57:39.754871 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5768680b_c657_4605_86d2_05896de9cea7.slice/crio-9cc98720a118e503ec7feea790cf9e42a745ba21859d4313a10e7bf31400088c WatchSource:0}: Error finding container 9cc98720a118e503ec7feea790cf9e42a745ba21859d4313a10e7bf31400088c: Status 404 returned error can't find the container with id 9cc98720a118e503ec7feea790cf9e42a745ba21859d4313a10e7bf31400088c Apr 17 18:57:39.904374 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.904347 2571 generic.go:358] "Generic (PLEG): container finished" podID="fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" containerID="159e580276196094b5ea5b93464a358847eeed11833f51ca0eb1f7217829801f" exitCode=0 Apr 17 18:57:39.904511 ip-10-0-132-192 
kubenswrapper[2571]: I0417 18:57:39.904419 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" event={"ID":"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d","Type":"ContainerDied","Data":"159e580276196094b5ea5b93464a358847eeed11833f51ca0eb1f7217829801f"} Apr 17 18:57:39.904511 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.904444 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" event={"ID":"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d","Type":"ContainerStarted","Data":"e098e6c47bf0224cbc7798c23ed8a6c115b6d2dad40a480e710febfe52aa8090"} Apr 17 18:57:39.905860 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.905837 2571 generic.go:358] "Generic (PLEG): container finished" podID="b1b8254e-6889-4a58-99ba-3dfe78089325" containerID="47c3fef141ffe3edeaeedf777be750f2d55d93c9cd77cc414ca19671e7030820" exitCode=0 Apr 17 18:57:39.905977 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.905929 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" event={"ID":"b1b8254e-6889-4a58-99ba-3dfe78089325","Type":"ContainerDied","Data":"47c3fef141ffe3edeaeedf777be750f2d55d93c9cd77cc414ca19671e7030820"} Apr 17 18:57:39.905977 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.905952 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" event={"ID":"b1b8254e-6889-4a58-99ba-3dfe78089325","Type":"ContainerStarted","Data":"2669b65bb25b7de9be45a64b9d018edc8d2cae45733ea9e3695990d6fb6f6c4c"} Apr 17 18:57:39.907323 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.907296 2571 generic.go:358] "Generic (PLEG): container finished" podID="5768680b-c657-4605-86d2-05896de9cea7" containerID="7c9ca6133d5222fadaab7e74f31e411ec46b6de23d2c411f144113d9a851a360" 
exitCode=0 Apr 17 18:57:39.907486 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.907399 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" event={"ID":"5768680b-c657-4605-86d2-05896de9cea7","Type":"ContainerDied","Data":"7c9ca6133d5222fadaab7e74f31e411ec46b6de23d2c411f144113d9a851a360"} Apr 17 18:57:39.907486 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.907422 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" event={"ID":"5768680b-c657-4605-86d2-05896de9cea7","Type":"ContainerStarted","Data":"9cc98720a118e503ec7feea790cf9e42a745ba21859d4313a10e7bf31400088c"} Apr 17 18:57:39.908918 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.908894 2571 generic.go:358] "Generic (PLEG): container finished" podID="ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" containerID="871964fa61f33c88e3cf11321ce27c48b757d21a2ed338998b4e7bc5782d6d64" exitCode=0 Apr 17 18:57:39.909012 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.908941 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" event={"ID":"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df","Type":"ContainerDied","Data":"871964fa61f33c88e3cf11321ce27c48b757d21a2ed338998b4e7bc5782d6d64"} Apr 17 18:57:39.909012 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:39.908963 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" event={"ID":"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df","Type":"ContainerStarted","Data":"1808cf0114faa56e47da0d66e8a91d31a0b45f6e926463be3565acb98ba0d355"} Apr 17 18:57:41.917483 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:41.917425 2571 generic.go:358] "Generic (PLEG): container finished" podID="fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" 
containerID="259428ddec753d9d1b7a293b15bdbeb0e355cd1b215f9c5514ec5ab6b6ec57e1" exitCode=0 Apr 17 18:57:41.917917 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:41.917506 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" event={"ID":"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d","Type":"ContainerDied","Data":"259428ddec753d9d1b7a293b15bdbeb0e355cd1b215f9c5514ec5ab6b6ec57e1"} Apr 17 18:57:41.919264 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:41.919240 2571 generic.go:358] "Generic (PLEG): container finished" podID="b1b8254e-6889-4a58-99ba-3dfe78089325" containerID="0c1057cef7101e931d5e65b745322f108a2f3487fceaeee6419e34e721efce41" exitCode=0 Apr 17 18:57:41.919360 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:41.919342 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" event={"ID":"b1b8254e-6889-4a58-99ba-3dfe78089325","Type":"ContainerDied","Data":"0c1057cef7101e931d5e65b745322f108a2f3487fceaeee6419e34e721efce41"} Apr 17 18:57:41.921103 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:41.921082 2571 generic.go:358] "Generic (PLEG): container finished" podID="5768680b-c657-4605-86d2-05896de9cea7" containerID="3ba10f34cad6149d3e57c6af09c01964c1428b38739556f811effd082044d1d4" exitCode=0 Apr 17 18:57:41.921193 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:41.921162 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" event={"ID":"5768680b-c657-4605-86d2-05896de9cea7","Type":"ContainerDied","Data":"3ba10f34cad6149d3e57c6af09c01964c1428b38739556f811effd082044d1d4"} Apr 17 18:57:41.922915 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:41.922884 2571 generic.go:358] "Generic (PLEG): container finished" podID="ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" 
containerID="68c7084277446c270985586482276b1884f95af2c2214f90c3f587fd69b043f7" exitCode=0 Apr 17 18:57:41.922979 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:41.922927 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" event={"ID":"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df","Type":"ContainerDied","Data":"68c7084277446c270985586482276b1884f95af2c2214f90c3f587fd69b043f7"} Apr 17 18:57:42.928938 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:42.928906 2571 generic.go:358] "Generic (PLEG): container finished" podID="ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" containerID="c8107cae8a0b8f99a3ed3538ab52ad41300e1c5f84bcb022a8530679e8c0de40" exitCode=0 Apr 17 18:57:42.929330 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:42.928969 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" event={"ID":"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df","Type":"ContainerDied","Data":"c8107cae8a0b8f99a3ed3538ab52ad41300e1c5f84bcb022a8530679e8c0de40"} Apr 17 18:57:42.930817 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:42.930796 2571 generic.go:358] "Generic (PLEG): container finished" podID="fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" containerID="fa1a8fbe049a830a0d4e809717bff93f40921f90fa1e63e0d54207140868632e" exitCode=0 Apr 17 18:57:42.930936 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:42.930853 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" event={"ID":"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d","Type":"ContainerDied","Data":"fa1a8fbe049a830a0d4e809717bff93f40921f90fa1e63e0d54207140868632e"} Apr 17 18:57:42.932555 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:42.932534 2571 generic.go:358] "Generic (PLEG): container finished" podID="b1b8254e-6889-4a58-99ba-3dfe78089325" 
containerID="073b9d04f1902fbecac96fc0e7cb6053cfe28da0a0f2c9c26da77a61177ec964" exitCode=0 Apr 17 18:57:42.932668 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:42.932628 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" event={"ID":"b1b8254e-6889-4a58-99ba-3dfe78089325","Type":"ContainerDied","Data":"073b9d04f1902fbecac96fc0e7cb6053cfe28da0a0f2c9c26da77a61177ec964"} Apr 17 18:57:42.934326 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:42.934305 2571 generic.go:358] "Generic (PLEG): container finished" podID="5768680b-c657-4605-86d2-05896de9cea7" containerID="bc3deb8d93d50d03a5e631d2187e5f9b7e88d20b36af573d83a383f15b9a9490" exitCode=0 Apr 17 18:57:42.934410 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:42.934356 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" event={"ID":"5768680b-c657-4605-86d2-05896de9cea7","Type":"ContainerDied","Data":"bc3deb8d93d50d03a5e631d2187e5f9b7e88d20b36af573d83a383f15b9a9490"} Apr 17 18:57:44.072589 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.072563 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:44.116437 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.116408 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:44.156944 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.156922 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:44.159999 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.159982 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:44.203664 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.203588 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5768680b-c657-4605-86d2-05896de9cea7-util\") pod \"5768680b-c657-4605-86d2-05896de9cea7\" (UID: \"5768680b-c657-4605-86d2-05896de9cea7\") " Apr 17 18:57:44.203664 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.203636 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-bundle\") pod \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\" (UID: \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\") " Apr 17 18:57:44.203859 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.203703 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q79vp\" (UniqueName: \"kubernetes.io/projected/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-kube-api-access-q79vp\") pod \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\" (UID: \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\") " Apr 17 18:57:44.203859 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.203739 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-util\") pod \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\" (UID: \"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df\") " Apr 17 18:57:44.203859 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.203764 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsv2k\" (UniqueName: \"kubernetes.io/projected/5768680b-c657-4605-86d2-05896de9cea7-kube-api-access-bsv2k\") pod \"5768680b-c657-4605-86d2-05896de9cea7\" (UID: \"5768680b-c657-4605-86d2-05896de9cea7\") " Apr 17 18:57:44.203859 ip-10-0-132-192 
kubenswrapper[2571]: I0417 18:57:44.203799 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5768680b-c657-4605-86d2-05896de9cea7-bundle\") pod \"5768680b-c657-4605-86d2-05896de9cea7\" (UID: \"5768680b-c657-4605-86d2-05896de9cea7\") " Apr 17 18:57:44.204309 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.204282 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-bundle" (OuterVolumeSpecName: "bundle") pod "ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" (UID: "ee74b3d9-cab8-4e88-9052-dd2d7f28e1df"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:57:44.204480 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.204419 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5768680b-c657-4605-86d2-05896de9cea7-bundle" (OuterVolumeSpecName: "bundle") pod "5768680b-c657-4605-86d2-05896de9cea7" (UID: "5768680b-c657-4605-86d2-05896de9cea7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:57:44.206449 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.206416 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5768680b-c657-4605-86d2-05896de9cea7-kube-api-access-bsv2k" (OuterVolumeSpecName: "kube-api-access-bsv2k") pod "5768680b-c657-4605-86d2-05896de9cea7" (UID: "5768680b-c657-4605-86d2-05896de9cea7"). InnerVolumeSpecName "kube-api-access-bsv2k". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:57:44.206597 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.206475 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-kube-api-access-q79vp" (OuterVolumeSpecName: "kube-api-access-q79vp") pod "ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" (UID: "ee74b3d9-cab8-4e88-9052-dd2d7f28e1df"). InnerVolumeSpecName "kube-api-access-q79vp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:57:44.209900 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.209872 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5768680b-c657-4605-86d2-05896de9cea7-util" (OuterVolumeSpecName: "util") pod "5768680b-c657-4605-86d2-05896de9cea7" (UID: "5768680b-c657-4605-86d2-05896de9cea7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:57:44.210184 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.210159 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-util" (OuterVolumeSpecName: "util") pod "ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" (UID: "ee74b3d9-cab8-4e88-9052-dd2d7f28e1df"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:57:44.304943 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.304906 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1b8254e-6889-4a58-99ba-3dfe78089325-bundle\") pod \"b1b8254e-6889-4a58-99ba-3dfe78089325\" (UID: \"b1b8254e-6889-4a58-99ba-3dfe78089325\") " Apr 17 18:57:44.305129 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.304960 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-util\") pod \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\" (UID: \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\") " Apr 17 18:57:44.305129 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.304994 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvwmg\" (UniqueName: \"kubernetes.io/projected/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-kube-api-access-mvwmg\") pod \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\" (UID: \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\") " Apr 17 18:57:44.305129 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.305033 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr957\" (UniqueName: \"kubernetes.io/projected/b1b8254e-6889-4a58-99ba-3dfe78089325-kube-api-access-hr957\") pod \"b1b8254e-6889-4a58-99ba-3dfe78089325\" (UID: \"b1b8254e-6889-4a58-99ba-3dfe78089325\") " Apr 17 18:57:44.305313 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.305163 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-bundle\") pod \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\" (UID: \"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d\") " Apr 17 18:57:44.305313 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.305228 2571 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1b8254e-6889-4a58-99ba-3dfe78089325-util\") pod \"b1b8254e-6889-4a58-99ba-3dfe78089325\" (UID: \"b1b8254e-6889-4a58-99ba-3dfe78089325\") " Apr 17 18:57:44.305576 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.305556 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5768680b-c657-4605-86d2-05896de9cea7-util\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:44.305718 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.305581 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:44.305718 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.305597 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q79vp\" (UniqueName: \"kubernetes.io/projected/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-kube-api-access-q79vp\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:44.305718 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.305612 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee74b3d9-cab8-4e88-9052-dd2d7f28e1df-util\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:44.305718 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.305628 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bsv2k\" (UniqueName: \"kubernetes.io/projected/5768680b-c657-4605-86d2-05896de9cea7-kube-api-access-bsv2k\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:44.305718 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.305641 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5768680b-c657-4605-86d2-05896de9cea7-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:44.305976 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.305831 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b8254e-6889-4a58-99ba-3dfe78089325-bundle" (OuterVolumeSpecName: "bundle") pod "b1b8254e-6889-4a58-99ba-3dfe78089325" (UID: "b1b8254e-6889-4a58-99ba-3dfe78089325"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:57:44.305976 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.305932 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-bundle" (OuterVolumeSpecName: "bundle") pod "fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" (UID: "fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:57:44.307239 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.307211 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b8254e-6889-4a58-99ba-3dfe78089325-kube-api-access-hr957" (OuterVolumeSpecName: "kube-api-access-hr957") pod "b1b8254e-6889-4a58-99ba-3dfe78089325" (UID: "b1b8254e-6889-4a58-99ba-3dfe78089325"). InnerVolumeSpecName "kube-api-access-hr957". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:57:44.307324 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.307265 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-kube-api-access-mvwmg" (OuterVolumeSpecName: "kube-api-access-mvwmg") pod "fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" (UID: "fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d"). InnerVolumeSpecName "kube-api-access-mvwmg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:57:44.310708 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.310678 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-util" (OuterVolumeSpecName: "util") pod "fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" (UID: "fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:57:44.311436 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.311413 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b8254e-6889-4a58-99ba-3dfe78089325-util" (OuterVolumeSpecName: "util") pod "b1b8254e-6889-4a58-99ba-3dfe78089325" (UID: "b1b8254e-6889-4a58-99ba-3dfe78089325"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:57:44.406502 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.406434 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1b8254e-6889-4a58-99ba-3dfe78089325-util\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:44.406502 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.406496 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1b8254e-6889-4a58-99ba-3dfe78089325-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:44.406502 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.406506 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-util\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:44.406502 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.406515 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mvwmg\" (UniqueName: 
\"kubernetes.io/projected/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-kube-api-access-mvwmg\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:44.406754 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.406527 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hr957\" (UniqueName: \"kubernetes.io/projected/b1b8254e-6889-4a58-99ba-3dfe78089325-kube-api-access-hr957\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:44.406754 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.406536 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:57:44.943720 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.943686 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" event={"ID":"fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d","Type":"ContainerDied","Data":"e098e6c47bf0224cbc7798c23ed8a6c115b6d2dad40a480e710febfe52aa8090"} Apr 17 18:57:44.943720 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.943717 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr" Apr 17 18:57:44.943924 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.943720 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e098e6c47bf0224cbc7798c23ed8a6c115b6d2dad40a480e710febfe52aa8090" Apr 17 18:57:44.945279 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.945251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" event={"ID":"b1b8254e-6889-4a58-99ba-3dfe78089325","Type":"ContainerDied","Data":"2669b65bb25b7de9be45a64b9d018edc8d2cae45733ea9e3695990d6fb6f6c4c"} Apr 17 18:57:44.945407 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.945281 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2669b65bb25b7de9be45a64b9d018edc8d2cae45733ea9e3695990d6fb6f6c4c" Apr 17 18:57:44.945407 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.945260 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2" Apr 17 18:57:44.947088 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.947064 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" Apr 17 18:57:44.947182 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.947068 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8" event={"ID":"5768680b-c657-4605-86d2-05896de9cea7","Type":"ContainerDied","Data":"9cc98720a118e503ec7feea790cf9e42a745ba21859d4313a10e7bf31400088c"} Apr 17 18:57:44.947231 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.947178 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cc98720a118e503ec7feea790cf9e42a745ba21859d4313a10e7bf31400088c" Apr 17 18:57:44.948873 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.948853 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" event={"ID":"ee74b3d9-cab8-4e88-9052-dd2d7f28e1df","Type":"ContainerDied","Data":"1808cf0114faa56e47da0d66e8a91d31a0b45f6e926463be3565acb98ba0d355"} Apr 17 18:57:44.948873 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.948865 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7" Apr 17 18:57:44.948991 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:44.948877 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1808cf0114faa56e47da0d66e8a91d31a0b45f6e926463be3565acb98ba0d355" Apr 17 18:57:48.452005 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.451972 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-556fb96dfc-fjvvd"] Apr 17 18:57:48.452344 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452318 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1b8254e-6889-4a58-99ba-3dfe78089325" containerName="util" Apr 17 18:57:48.452344 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452329 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b8254e-6889-4a58-99ba-3dfe78089325" containerName="util" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452346 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" containerName="extract" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452354 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" containerName="extract" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452361 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" containerName="util" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452369 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" containerName="util" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452374 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" containerName="pull" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452380 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" containerName="pull" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452387 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5768680b-c657-4605-86d2-05896de9cea7" containerName="extract" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452392 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5768680b-c657-4605-86d2-05896de9cea7" containerName="extract" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452401 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1b8254e-6889-4a58-99ba-3dfe78089325" containerName="pull" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452406 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b8254e-6889-4a58-99ba-3dfe78089325" containerName="pull" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452412 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" containerName="pull" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452419 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" containerName="pull" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452436 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5768680b-c657-4605-86d2-05896de9cea7" containerName="pull" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452442 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5768680b-c657-4605-86d2-05896de9cea7" containerName="pull" Apr 17 
18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452448 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1b8254e-6889-4a58-99ba-3dfe78089325" containerName="extract" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452453 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b8254e-6889-4a58-99ba-3dfe78089325" containerName="extract" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452481 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" containerName="util" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452486 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" containerName="util" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452492 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" containerName="extract" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452497 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" containerName="extract" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452504 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5768680b-c657-4605-86d2-05896de9cea7" containerName="util" Apr 17 18:57:48.452542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452509 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5768680b-c657-4605-86d2-05896de9cea7" containerName="util" Apr 17 18:57:48.453252 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452563 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d" containerName="extract" Apr 17 18:57:48.453252 ip-10-0-132-192 kubenswrapper[2571]: I0417 
18:57:48.452572 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5768680b-c657-4605-86d2-05896de9cea7" containerName="extract" Apr 17 18:57:48.453252 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452578 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee74b3d9-cab8-4e88-9052-dd2d7f28e1df" containerName="extract" Apr 17 18:57:48.453252 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.452584 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1b8254e-6889-4a58-99ba-3dfe78089325" containerName="extract" Apr 17 18:57:48.456845 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.456829 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.466954 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.466931 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-556fb96dfc-fjvvd"] Apr 17 18:57:48.543898 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.543867 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49ba530-b751-4126-8940-163ce9a5d35b-trusted-ca-bundle\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.544023 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.543903 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f49ba530-b751-4126-8940-163ce9a5d35b-console-serving-cert\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.544023 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.543936 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f49ba530-b751-4126-8940-163ce9a5d35b-service-ca\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.544136 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.544029 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f49ba530-b751-4126-8940-163ce9a5d35b-console-oauth-config\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.544136 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.544079 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f49ba530-b751-4126-8940-163ce9a5d35b-oauth-serving-cert\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.544136 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.544115 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rfbl\" (UniqueName: \"kubernetes.io/projected/f49ba530-b751-4126-8940-163ce9a5d35b-kube-api-access-7rfbl\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.544245 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.544141 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f49ba530-b751-4126-8940-163ce9a5d35b-console-config\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " 
pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.645428 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.645399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49ba530-b751-4126-8940-163ce9a5d35b-trusted-ca-bundle\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.645537 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.645432 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f49ba530-b751-4126-8940-163ce9a5d35b-console-serving-cert\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.645537 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.645489 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f49ba530-b751-4126-8940-163ce9a5d35b-service-ca\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.645537 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.645524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f49ba530-b751-4126-8940-163ce9a5d35b-console-oauth-config\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.645701 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.645547 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f49ba530-b751-4126-8940-163ce9a5d35b-oauth-serving-cert\") pod 
\"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.645701 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.645567 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rfbl\" (UniqueName: \"kubernetes.io/projected/f49ba530-b751-4126-8940-163ce9a5d35b-kube-api-access-7rfbl\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.645701 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.645591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f49ba530-b751-4126-8940-163ce9a5d35b-console-config\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.646279 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.646254 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f49ba530-b751-4126-8940-163ce9a5d35b-service-ca\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.646369 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.646351 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f49ba530-b751-4126-8940-163ce9a5d35b-oauth-serving-cert\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.646369 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.646355 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f49ba530-b751-4126-8940-163ce9a5d35b-trusted-ca-bundle\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.646438 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.646368 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f49ba530-b751-4126-8940-163ce9a5d35b-console-config\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.647959 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.647937 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f49ba530-b751-4126-8940-163ce9a5d35b-console-oauth-config\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.648071 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.648055 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f49ba530-b751-4126-8940-163ce9a5d35b-console-serving-cert\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.653763 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.653740 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rfbl\" (UniqueName: \"kubernetes.io/projected/f49ba530-b751-4126-8940-163ce9a5d35b-kube-api-access-7rfbl\") pod \"console-556fb96dfc-fjvvd\" (UID: \"f49ba530-b751-4126-8940-163ce9a5d35b\") " pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.766401 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.766375 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:48.908771 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.908747 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-556fb96dfc-fjvvd"] Apr 17 18:57:48.910991 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:57:48.910963 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49ba530_b751_4126_8940_163ce9a5d35b.slice/crio-3fae30feb87560bcd7ced7d4fc4ab1b5c6ec455d1d4ff375b16d499db26e9a12 WatchSource:0}: Error finding container 3fae30feb87560bcd7ced7d4fc4ab1b5c6ec455d1d4ff375b16d499db26e9a12: Status 404 returned error can't find the container with id 3fae30feb87560bcd7ced7d4fc4ab1b5c6ec455d1d4ff375b16d499db26e9a12 Apr 17 18:57:48.968773 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:48.968743 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556fb96dfc-fjvvd" event={"ID":"f49ba530-b751-4126-8940-163ce9a5d35b","Type":"ContainerStarted","Data":"3fae30feb87560bcd7ced7d4fc4ab1b5c6ec455d1d4ff375b16d499db26e9a12"} Apr 17 18:57:49.577858 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:49.577818 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq"] Apr 17 18:57:49.582365 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:49.582341 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq" Apr 17 18:57:49.584717 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:49.584696 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-bqmb2\"" Apr 17 18:57:49.584808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:49.584720 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 18:57:49.590301 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:49.590278 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq"] Apr 17 18:57:49.672694 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:49.672670 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hckxz\" (UniqueName: \"kubernetes.io/projected/a2d8d80b-bcb5-4e98-87c8-7bf38bd21c44-kube-api-access-hckxz\") pod \"dns-operator-controller-manager-648d5c98bc-d4psq\" (UID: \"a2d8d80b-bcb5-4e98-87c8-7bf38bd21c44\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq" Apr 17 18:57:49.773378 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:49.773348 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hckxz\" (UniqueName: \"kubernetes.io/projected/a2d8d80b-bcb5-4e98-87c8-7bf38bd21c44-kube-api-access-hckxz\") pod \"dns-operator-controller-manager-648d5c98bc-d4psq\" (UID: \"a2d8d80b-bcb5-4e98-87c8-7bf38bd21c44\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq" Apr 17 18:57:49.792338 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:49.792312 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hckxz\" (UniqueName: \"kubernetes.io/projected/a2d8d80b-bcb5-4e98-87c8-7bf38bd21c44-kube-api-access-hckxz\") pod 
\"dns-operator-controller-manager-648d5c98bc-d4psq\" (UID: \"a2d8d80b-bcb5-4e98-87c8-7bf38bd21c44\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq" Apr 17 18:57:49.893194 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:49.893132 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq" Apr 17 18:57:49.975101 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:49.975061 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556fb96dfc-fjvvd" event={"ID":"f49ba530-b751-4126-8940-163ce9a5d35b","Type":"ContainerStarted","Data":"11b78c615613d8814b2be258a9673095d8062824919ee0cef4c7f28b5a7b189d"} Apr 17 18:57:49.999997 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:49.999955 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-556fb96dfc-fjvvd" podStartSLOduration=1.9999391800000001 podStartE2EDuration="1.99993918s" podCreationTimestamp="2026-04-17 18:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:57:49.997327872 +0000 UTC m=+509.271345165" watchObservedRunningTime="2026-04-17 18:57:49.99993918 +0000 UTC m=+509.273956473" Apr 17 18:57:50.017237 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:50.017213 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq"] Apr 17 18:57:50.019051 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:57:50.019025 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2d8d80b_bcb5_4e98_87c8_7bf38bd21c44.slice/crio-6c85bcb48f63573bc040b8aba6b492b5fde0c193d47bc79e8bb047707cc134ef WatchSource:0}: Error finding container 6c85bcb48f63573bc040b8aba6b492b5fde0c193d47bc79e8bb047707cc134ef: Status 404 returned 
error can't find the container with id 6c85bcb48f63573bc040b8aba6b492b5fde0c193d47bc79e8bb047707cc134ef Apr 17 18:57:50.988696 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:50.988653 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq" event={"ID":"a2d8d80b-bcb5-4e98-87c8-7bf38bd21c44","Type":"ContainerStarted","Data":"6c85bcb48f63573bc040b8aba6b492b5fde0c193d47bc79e8bb047707cc134ef"} Apr 17 18:57:52.998187 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:52.998150 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq" event={"ID":"a2d8d80b-bcb5-4e98-87c8-7bf38bd21c44","Type":"ContainerStarted","Data":"659b54e52b12254fff244c59d48829a6f08175a534c8b9b0d77ced49cf7a5e7a"} Apr 17 18:57:52.998624 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:52.998272 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq" Apr 17 18:57:53.015576 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:53.015528 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq" podStartSLOduration=1.770943816 podStartE2EDuration="4.0155164s" podCreationTimestamp="2026-04-17 18:57:49 +0000 UTC" firstStartedPulling="2026-04-17 18:57:50.02106477 +0000 UTC m=+509.295082045" lastFinishedPulling="2026-04-17 18:57:52.265637356 +0000 UTC m=+511.539654629" observedRunningTime="2026-04-17 18:57:53.012883741 +0000 UTC m=+512.286901046" watchObservedRunningTime="2026-04-17 18:57:53.0155164 +0000 UTC m=+512.289533692" Apr 17 18:57:54.543273 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:54.543239 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx"] Apr 17 18:57:54.546409 ip-10-0-132-192 kubenswrapper[2571]: I0417 
18:57:54.546393 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" Apr 17 18:57:54.548742 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:54.548719 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-rrr25\"" Apr 17 18:57:54.557771 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:54.557744 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx"] Apr 17 18:57:54.607175 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:54.607150 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b20ab65c-624f-4b77-9960-7900782bb108-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dwhsx\" (UID: \"b20ab65c-624f-4b77-9960-7900782bb108\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" Apr 17 18:57:54.607296 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:54.607182 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z27d\" (UniqueName: \"kubernetes.io/projected/b20ab65c-624f-4b77-9960-7900782bb108-kube-api-access-5z27d\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dwhsx\" (UID: \"b20ab65c-624f-4b77-9960-7900782bb108\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" Apr 17 18:57:54.707587 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:54.707557 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5z27d\" (UniqueName: \"kubernetes.io/projected/b20ab65c-624f-4b77-9960-7900782bb108-kube-api-access-5z27d\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dwhsx\" (UID: 
\"b20ab65c-624f-4b77-9960-7900782bb108\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" Apr 17 18:57:54.707708 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:54.707658 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b20ab65c-624f-4b77-9960-7900782bb108-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dwhsx\" (UID: \"b20ab65c-624f-4b77-9960-7900782bb108\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" Apr 17 18:57:54.707963 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:54.707948 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b20ab65c-624f-4b77-9960-7900782bb108-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dwhsx\" (UID: \"b20ab65c-624f-4b77-9960-7900782bb108\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" Apr 17 18:57:54.717204 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:54.717169 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z27d\" (UniqueName: \"kubernetes.io/projected/b20ab65c-624f-4b77-9960-7900782bb108-kube-api-access-5z27d\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dwhsx\" (UID: \"b20ab65c-624f-4b77-9960-7900782bb108\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" Apr 17 18:57:54.858209 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:54.858146 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" Apr 17 18:57:54.980053 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:54.980024 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx"] Apr 17 18:57:54.981752 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:57:54.981715 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb20ab65c_624f_4b77_9960_7900782bb108.slice/crio-beed588e05641a79d57f7154ce153647d0547c45767ab13accaed9661a478f0a WatchSource:0}: Error finding container beed588e05641a79d57f7154ce153647d0547c45767ab13accaed9661a478f0a: Status 404 returned error can't find the container with id beed588e05641a79d57f7154ce153647d0547c45767ab13accaed9661a478f0a Apr 17 18:57:55.006557 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:55.006529 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" event={"ID":"b20ab65c-624f-4b77-9960-7900782bb108","Type":"ContainerStarted","Data":"beed588e05641a79d57f7154ce153647d0547c45767ab13accaed9661a478f0a"} Apr 17 18:57:58.766939 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:58.766907 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:58.767391 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:58.766950 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:58.772151 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:58.772128 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:59.027525 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:59.027427 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console/console-556fb96dfc-fjvvd" Apr 17 18:57:59.079485 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:57:59.079433 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b5fc8f596-88zsz"] Apr 17 18:58:00.028766 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:00.028725 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" event={"ID":"b20ab65c-624f-4b77-9960-7900782bb108","Type":"ContainerStarted","Data":"c51816f2b577c3e0b93740ca1b23daa051322af3d138985f465d3aa7f4395722"} Apr 17 18:58:00.029216 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:00.028837 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" Apr 17 18:58:00.048004 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:00.047949 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" podStartSLOduration=1.227831823 podStartE2EDuration="6.04793368s" podCreationTimestamp="2026-04-17 18:57:54 +0000 UTC" firstStartedPulling="2026-04-17 18:57:54.984158384 +0000 UTC m=+514.258175656" lastFinishedPulling="2026-04-17 18:57:59.804260242 +0000 UTC m=+519.078277513" observedRunningTime="2026-04-17 18:58:00.045208607 +0000 UTC m=+519.319225899" watchObservedRunningTime="2026-04-17 18:58:00.04793368 +0000 UTC m=+519.321950974" Apr 17 18:58:01.509809 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:01.509769 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-rpcm7"] Apr 17 18:58:01.511877 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:01.511861 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-rpcm7" Apr 17 18:58:01.514126 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:01.514108 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-xrdkk\"" Apr 17 18:58:01.526156 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:01.526132 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-rpcm7"] Apr 17 18:58:01.561385 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:01.561355 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l2cx\" (UniqueName: \"kubernetes.io/projected/7c1cefaf-94fe-4ab4-a072-078ba0be1ec3-kube-api-access-6l2cx\") pod \"authorino-operator-657f44b778-rpcm7\" (UID: \"7c1cefaf-94fe-4ab4-a072-078ba0be1ec3\") " pod="kuadrant-system/authorino-operator-657f44b778-rpcm7" Apr 17 18:58:01.662161 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:01.662128 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l2cx\" (UniqueName: \"kubernetes.io/projected/7c1cefaf-94fe-4ab4-a072-078ba0be1ec3-kube-api-access-6l2cx\") pod \"authorino-operator-657f44b778-rpcm7\" (UID: \"7c1cefaf-94fe-4ab4-a072-078ba0be1ec3\") " pod="kuadrant-system/authorino-operator-657f44b778-rpcm7" Apr 17 18:58:01.669632 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:01.669607 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l2cx\" (UniqueName: \"kubernetes.io/projected/7c1cefaf-94fe-4ab4-a072-078ba0be1ec3-kube-api-access-6l2cx\") pod \"authorino-operator-657f44b778-rpcm7\" (UID: \"7c1cefaf-94fe-4ab4-a072-078ba0be1ec3\") " pod="kuadrant-system/authorino-operator-657f44b778-rpcm7" Apr 17 18:58:01.822023 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:01.821936 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-rpcm7" Apr 17 18:58:01.943295 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:01.943266 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-rpcm7"] Apr 17 18:58:01.945378 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:58:01.945345 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c1cefaf_94fe_4ab4_a072_078ba0be1ec3.slice/crio-6ca9f8ac8bc04994569871a6ef051a837703e83bcaea00d2555049154410a49e WatchSource:0}: Error finding container 6ca9f8ac8bc04994569871a6ef051a837703e83bcaea00d2555049154410a49e: Status 404 returned error can't find the container with id 6ca9f8ac8bc04994569871a6ef051a837703e83bcaea00d2555049154410a49e Apr 17 18:58:02.037165 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:02.037129 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-rpcm7" event={"ID":"7c1cefaf-94fe-4ab4-a072-078ba0be1ec3","Type":"ContainerStarted","Data":"6ca9f8ac8bc04994569871a6ef051a837703e83bcaea00d2555049154410a49e"} Apr 17 18:58:04.004381 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:04.004286 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-d4psq" Apr 17 18:58:05.051609 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:05.051568 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-rpcm7" event={"ID":"7c1cefaf-94fe-4ab4-a072-078ba0be1ec3","Type":"ContainerStarted","Data":"33f143b9fac124557f7c591b140685d6f90ec50e015d2a2d6b2d3ae2a41ad43d"} Apr 17 18:58:05.052058 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:05.051667 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-rpcm7" Apr 17 18:58:05.067365 
ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:05.067311 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-rpcm7" podStartSLOduration=1.8674179579999999 podStartE2EDuration="4.067293897s" podCreationTimestamp="2026-04-17 18:58:01 +0000 UTC" firstStartedPulling="2026-04-17 18:58:01.947386221 +0000 UTC m=+521.221403492" lastFinishedPulling="2026-04-17 18:58:04.14726216 +0000 UTC m=+523.421279431" observedRunningTime="2026-04-17 18:58:05.066051791 +0000 UTC m=+524.340069086" watchObservedRunningTime="2026-04-17 18:58:05.067293897 +0000 UTC m=+524.341311198" Apr 17 18:58:11.034789 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:11.034754 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" Apr 17 18:58:12.837732 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.837700 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx"] Apr 17 18:58:12.838275 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.838243 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" podUID="b20ab65c-624f-4b77-9960-7900782bb108" containerName="manager" containerID="cri-o://c51816f2b577c3e0b93740ca1b23daa051322af3d138985f465d3aa7f4395722" gracePeriod=2 Apr 17 18:58:12.840076 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.840033 2571 status_manager.go:895] "Failed to get status for pod" podUID="b20ab65c-624f-4b77-9960-7900782bb108" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dwhsx\" is forbidden: User \"system:node:ip-10-0-132-192.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between 
node 'ip-10-0-132-192.ec2.internal' and this object" Apr 17 18:58:12.840750 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.840726 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx"] Apr 17 18:58:12.859287 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.859265 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q"] Apr 17 18:58:12.859661 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.859647 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20ab65c-624f-4b77-9960-7900782bb108" containerName="manager" Apr 17 18:58:12.859661 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.859663 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20ab65c-624f-4b77-9960-7900782bb108" containerName="manager" Apr 17 18:58:12.859751 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.859715 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b20ab65c-624f-4b77-9960-7900782bb108" containerName="manager" Apr 17 18:58:12.862633 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.862616 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" Apr 17 18:58:12.871811 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.871787 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q"] Apr 17 18:58:12.888692 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.888658 2571 status_manager.go:895] "Failed to get status for pod" podUID="b20ab65c-624f-4b77-9960-7900782bb108" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dwhsx\" is forbidden: User \"system:node:ip-10-0-132-192.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-192.ec2.internal' and this object" Apr 17 18:58:12.950583 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.950557 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/075446b0-6eb7-4b72-9ea9-ddae593f4a5c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dgj5q\" (UID: \"075446b0-6eb7-4b72-9ea9-ddae593f4a5c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" Apr 17 18:58:12.950676 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:12.950614 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb9rb\" (UniqueName: \"kubernetes.io/projected/075446b0-6eb7-4b72-9ea9-ddae593f4a5c-kube-api-access-tb9rb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dgj5q\" (UID: \"075446b0-6eb7-4b72-9ea9-ddae593f4a5c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" Apr 17 18:58:13.052025 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.051995 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tb9rb\" (UniqueName: \"kubernetes.io/projected/075446b0-6eb7-4b72-9ea9-ddae593f4a5c-kube-api-access-tb9rb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dgj5q\" (UID: \"075446b0-6eb7-4b72-9ea9-ddae593f4a5c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" Apr 17 18:58:13.052144 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.052093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/075446b0-6eb7-4b72-9ea9-ddae593f4a5c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dgj5q\" (UID: \"075446b0-6eb7-4b72-9ea9-ddae593f4a5c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" Apr 17 18:58:13.052395 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.052375 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/075446b0-6eb7-4b72-9ea9-ddae593f4a5c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dgj5q\" (UID: \"075446b0-6eb7-4b72-9ea9-ddae593f4a5c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" Apr 17 18:58:13.059440 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.059416 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb9rb\" (UniqueName: \"kubernetes.io/projected/075446b0-6eb7-4b72-9ea9-ddae593f4a5c-kube-api-access-tb9rb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dgj5q\" (UID: \"075446b0-6eb7-4b72-9ea9-ddae593f4a5c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" Apr 17 18:58:13.073924 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.073906 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" Apr 17 18:58:13.075792 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.075768 2571 status_manager.go:895] "Failed to get status for pod" podUID="b20ab65c-624f-4b77-9960-7900782bb108" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dwhsx\" is forbidden: User \"system:node:ip-10-0-132-192.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-192.ec2.internal' and this object" Apr 17 18:58:13.080723 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.080700 2571 generic.go:358] "Generic (PLEG): container finished" podID="b20ab65c-624f-4b77-9960-7900782bb108" containerID="c51816f2b577c3e0b93740ca1b23daa051322af3d138985f465d3aa7f4395722" exitCode=0 Apr 17 18:58:13.080808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.080753 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" Apr 17 18:58:13.080808 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.080802 2571 scope.go:117] "RemoveContainer" containerID="c51816f2b577c3e0b93740ca1b23daa051322af3d138985f465d3aa7f4395722" Apr 17 18:58:13.082608 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.082581 2571 status_manager.go:895] "Failed to get status for pod" podUID="b20ab65c-624f-4b77-9960-7900782bb108" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dwhsx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dwhsx\" is forbidden: User \"system:node:ip-10-0-132-192.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-192.ec2.internal' and this object" Apr 17 18:58:13.089160 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.089144 2571 scope.go:117] "RemoveContainer" containerID="c51816f2b577c3e0b93740ca1b23daa051322af3d138985f465d3aa7f4395722" Apr 17 18:58:13.089424 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:58:13.089408 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c51816f2b577c3e0b93740ca1b23daa051322af3d138985f465d3aa7f4395722\": container with ID starting with c51816f2b577c3e0b93740ca1b23daa051322af3d138985f465d3aa7f4395722 not found: ID does not exist" containerID="c51816f2b577c3e0b93740ca1b23daa051322af3d138985f465d3aa7f4395722" Apr 17 18:58:13.089483 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.089433 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c51816f2b577c3e0b93740ca1b23daa051322af3d138985f465d3aa7f4395722"} err="failed to get container status \"c51816f2b577c3e0b93740ca1b23daa051322af3d138985f465d3aa7f4395722\": rpc error: code = NotFound desc = could not find container 
\"c51816f2b577c3e0b93740ca1b23daa051322af3d138985f465d3aa7f4395722\": container with ID starting with c51816f2b577c3e0b93740ca1b23daa051322af3d138985f465d3aa7f4395722 not found: ID does not exist" Apr 17 18:58:13.153106 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.153085 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b20ab65c-624f-4b77-9960-7900782bb108-extensions-socket-volume\") pod \"b20ab65c-624f-4b77-9960-7900782bb108\" (UID: \"b20ab65c-624f-4b77-9960-7900782bb108\") " Apr 17 18:58:13.153172 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.153140 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z27d\" (UniqueName: \"kubernetes.io/projected/b20ab65c-624f-4b77-9960-7900782bb108-kube-api-access-5z27d\") pod \"b20ab65c-624f-4b77-9960-7900782bb108\" (UID: \"b20ab65c-624f-4b77-9960-7900782bb108\") " Apr 17 18:58:13.153587 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.153567 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b20ab65c-624f-4b77-9960-7900782bb108-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "b20ab65c-624f-4b77-9960-7900782bb108" (UID: "b20ab65c-624f-4b77-9960-7900782bb108"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:58:13.154960 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.154939 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20ab65c-624f-4b77-9960-7900782bb108-kube-api-access-5z27d" (OuterVolumeSpecName: "kube-api-access-5z27d") pod "b20ab65c-624f-4b77-9960-7900782bb108" (UID: "b20ab65c-624f-4b77-9960-7900782bb108"). InnerVolumeSpecName "kube-api-access-5z27d". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:58:13.206288 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.206259 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" Apr 17 18:58:13.237707 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.237672 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20ab65c-624f-4b77-9960-7900782bb108" path="/var/lib/kubelet/pods/b20ab65c-624f-4b77-9960-7900782bb108/volumes" Apr 17 18:58:13.254385 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.254332 2571 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b20ab65c-624f-4b77-9960-7900782bb108-extensions-socket-volume\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:58:13.254385 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.254362 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5z27d\" (UniqueName: \"kubernetes.io/projected/b20ab65c-624f-4b77-9960-7900782bb108-kube-api-access-5z27d\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:58:13.346858 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:13.346832 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q"] Apr 17 18:58:13.348426 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:58:13.348387 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod075446b0_6eb7_4b72_9ea9_ddae593f4a5c.slice/crio-ca0857551b27a7006fb79b329201642664bab265234a570dd356cb7ef7827de9 WatchSource:0}: Error finding container ca0857551b27a7006fb79b329201642664bab265234a570dd356cb7ef7827de9: Status 404 returned error can't find the container with id ca0857551b27a7006fb79b329201642664bab265234a570dd356cb7ef7827de9 Apr 17 
18:58:14.085922 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:14.085880 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" event={"ID":"075446b0-6eb7-4b72-9ea9-ddae593f4a5c","Type":"ContainerStarted","Data":"862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780"} Apr 17 18:58:14.085922 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:14.085921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" event={"ID":"075446b0-6eb7-4b72-9ea9-ddae593f4a5c","Type":"ContainerStarted","Data":"ca0857551b27a7006fb79b329201642664bab265234a570dd356cb7ef7827de9"} Apr 17 18:58:14.086390 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:14.086021 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" Apr 17 18:58:14.113862 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:14.113808 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" podStartSLOduration=2.113790919 podStartE2EDuration="2.113790919s" podCreationTimestamp="2026-04-17 18:58:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:58:14.112212151 +0000 UTC m=+533.386229445" watchObservedRunningTime="2026-04-17 18:58:14.113790919 +0000 UTC m=+533.387808213" Apr 17 18:58:16.057333 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:16.057303 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-rpcm7" Apr 17 18:58:24.100628 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.100566 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b5fc8f596-88zsz" 
podUID="4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" containerName="console" containerID="cri-o://b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276" gracePeriod=15 Apr 17 18:58:24.330355 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.330333 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b5fc8f596-88zsz_4ffc8cb3-d10c-4e68-8c65-fac351b80c4d/console/0.log" Apr 17 18:58:24.330484 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.330389 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b5fc8f596-88zsz" Apr 17 18:58:24.452885 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.452797 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwmwl\" (UniqueName: \"kubernetes.io/projected/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-kube-api-access-mwmwl\") pod \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " Apr 17 18:58:24.452885 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.452845 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-config\") pod \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " Apr 17 18:58:24.452885 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.452873 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-serving-cert\") pod \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " Apr 17 18:58:24.453155 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.452897 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-oauth-config\") pod \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " Apr 17 18:58:24.453155 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.452918 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-trusted-ca-bundle\") pod \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " Apr 17 18:58:24.453155 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.452936 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-oauth-serving-cert\") pod \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " Apr 17 18:58:24.453155 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.452991 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-service-ca\") pod \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\" (UID: \"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d\") " Apr 17 18:58:24.453389 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.453370 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-config" (OuterVolumeSpecName: "console-config") pod "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" (UID: "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:58:24.453448 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.453427 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" (UID: "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:58:24.453535 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.453487 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" (UID: "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:58:24.453794 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.453769 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-service-ca" (OuterVolumeSpecName: "service-ca") pod "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" (UID: "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 18:58:24.455297 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.455273 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-kube-api-access-mwmwl" (OuterVolumeSpecName: "kube-api-access-mwmwl") pod "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" (UID: "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d"). InnerVolumeSpecName "kube-api-access-mwmwl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 18:58:24.455297 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.455289 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" (UID: "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:58:24.455568 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.455359 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" (UID: "4ffc8cb3-d10c-4e68-8c65-fac351b80c4d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 18:58:24.553784 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.553755 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-serving-cert\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:58:24.553784 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.553780 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-oauth-config\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:58:24.553934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.553791 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-trusted-ca-bundle\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:58:24.553934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.553800 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-oauth-serving-cert\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:58:24.553934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.553808 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-service-ca\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:58:24.553934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.553817 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mwmwl\" (UniqueName: \"kubernetes.io/projected/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-kube-api-access-mwmwl\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:58:24.553934 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:24.553827 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d-console-config\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:58:25.093068 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:25.093033 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q"
Apr 17 18:58:25.128436 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:25.128411 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b5fc8f596-88zsz_4ffc8cb3-d10c-4e68-8c65-fac351b80c4d/console/0.log"
Apr 17 18:58:25.128943 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:25.128451 2571 generic.go:358] "Generic (PLEG): container finished" podID="4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" containerID="b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276" exitCode=2
Apr 17 18:58:25.128943 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:25.128544 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b5fc8f596-88zsz"
Apr 17 18:58:25.128943 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:25.128604 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5fc8f596-88zsz" event={"ID":"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d","Type":"ContainerDied","Data":"b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276"}
Apr 17 18:58:25.128943 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:25.128637 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5fc8f596-88zsz" event={"ID":"4ffc8cb3-d10c-4e68-8c65-fac351b80c4d","Type":"ContainerDied","Data":"fce4225b0d41aea0efed0d46ae7e9bb5c8ba2e5f7eed445fce5029c880f3da98"}
Apr 17 18:58:25.128943 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:25.128656 2571 scope.go:117] "RemoveContainer" containerID="b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276"
Apr 17 18:58:25.139295 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:25.139276 2571 scope.go:117] "RemoveContainer" containerID="b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276"
Apr 17 18:58:25.139544 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:58:25.139525 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276\": container with ID starting with b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276 not found: ID does not exist" containerID="b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276"
Apr 17 18:58:25.139619 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:25.139552 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276"} err="failed to get container status \"b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276\": rpc error: code = NotFound desc = could not find container \"b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276\": container with ID starting with b26bdf3969bff9fef8f7e34a03b0760f170ed8feecc55f695cac3892b36c7276 not found: ID does not exist"
Apr 17 18:58:25.152704 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:25.152681 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b5fc8f596-88zsz"]
Apr 17 18:58:25.156956 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:25.156931 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b5fc8f596-88zsz"]
Apr 17 18:58:25.237553 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:25.237524 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" path="/var/lib/kubelet/pods/4ffc8cb3-d10c-4e68-8c65-fac351b80c4d/volumes"
Apr 17 18:58:30.517830 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:30.517796 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q"]
Apr 17 18:58:30.518223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:30.518021 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" podUID="075446b0-6eb7-4b72-9ea9-ddae593f4a5c" containerName="manager" containerID="cri-o://862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780" gracePeriod=10
Apr 17 18:58:30.761168 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:30.761145 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q"
Apr 17 18:58:30.913072 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:30.912996 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb9rb\" (UniqueName: \"kubernetes.io/projected/075446b0-6eb7-4b72-9ea9-ddae593f4a5c-kube-api-access-tb9rb\") pod \"075446b0-6eb7-4b72-9ea9-ddae593f4a5c\" (UID: \"075446b0-6eb7-4b72-9ea9-ddae593f4a5c\") "
Apr 17 18:58:30.913072 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:30.913061 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/075446b0-6eb7-4b72-9ea9-ddae593f4a5c-extensions-socket-volume\") pod \"075446b0-6eb7-4b72-9ea9-ddae593f4a5c\" (UID: \"075446b0-6eb7-4b72-9ea9-ddae593f4a5c\") "
Apr 17 18:58:30.913430 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:30.913406 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/075446b0-6eb7-4b72-9ea9-ddae593f4a5c-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "075446b0-6eb7-4b72-9ea9-ddae593f4a5c" (UID: "075446b0-6eb7-4b72-9ea9-ddae593f4a5c"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 18:58:30.915011 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:30.914992 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/075446b0-6eb7-4b72-9ea9-ddae593f4a5c-kube-api-access-tb9rb" (OuterVolumeSpecName: "kube-api-access-tb9rb") pod "075446b0-6eb7-4b72-9ea9-ddae593f4a5c" (UID: "075446b0-6eb7-4b72-9ea9-ddae593f4a5c"). InnerVolumeSpecName "kube-api-access-tb9rb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 18:58:31.014470 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:31.014431 2571 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/075446b0-6eb7-4b72-9ea9-ddae593f4a5c-extensions-socket-volume\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:58:31.014631 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:31.014478 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tb9rb\" (UniqueName: \"kubernetes.io/projected/075446b0-6eb7-4b72-9ea9-ddae593f4a5c-kube-api-access-tb9rb\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\""
Apr 17 18:58:31.153507 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:31.153442 2571 generic.go:358] "Generic (PLEG): container finished" podID="075446b0-6eb7-4b72-9ea9-ddae593f4a5c" containerID="862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780" exitCode=0
Apr 17 18:58:31.153663 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:31.153518 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" event={"ID":"075446b0-6eb7-4b72-9ea9-ddae593f4a5c","Type":"ContainerDied","Data":"862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780"}
Apr 17 18:58:31.153663 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:31.153532 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q"
Apr 17 18:58:31.153663 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:31.153546 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q" event={"ID":"075446b0-6eb7-4b72-9ea9-ddae593f4a5c","Type":"ContainerDied","Data":"ca0857551b27a7006fb79b329201642664bab265234a570dd356cb7ef7827de9"}
Apr 17 18:58:31.153663 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:31.153562 2571 scope.go:117] "RemoveContainer" containerID="862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780"
Apr 17 18:58:31.162676 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:31.162652 2571 scope.go:117] "RemoveContainer" containerID="862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780"
Apr 17 18:58:31.162997 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:58:31.162973 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780\": container with ID starting with 862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780 not found: ID does not exist" containerID="862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780"
Apr 17 18:58:31.163118 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:31.163014 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780"} err="failed to get container status \"862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780\": rpc error: code = NotFound desc = could not find container \"862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780\": container with ID starting with 862ae7acf677f7c87c2d6f7a14e4452a6cdefb88fc217165331c156b42e08780 not found: ID does not exist"
Apr 17 18:58:31.177956 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:31.177933 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q"]
Apr 17 18:58:31.180978 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:31.180953 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dgj5q"]
Apr 17 18:58:31.237446 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:31.237415 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="075446b0-6eb7-4b72-9ea9-ddae593f4a5c" path="/var/lib/kubelet/pods/075446b0-6eb7-4b72-9ea9-ddae593f4a5c/volumes"
Apr 17 18:58:46.825331 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.825302 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"]
Apr 17 18:58:46.825753 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.825651 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" containerName="console"
Apr 17 18:58:46.825753 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.825662 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" containerName="console"
Apr 17 18:58:46.825753 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.825675 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="075446b0-6eb7-4b72-9ea9-ddae593f4a5c" containerName="manager"
Apr 17 18:58:46.825753 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.825681 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="075446b0-6eb7-4b72-9ea9-ddae593f4a5c" containerName="manager"
Apr 17 18:58:46.825753 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.825735 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ffc8cb3-d10c-4e68-8c65-fac351b80c4d" containerName="console"
Apr 17 18:58:46.825753 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.825746 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="075446b0-6eb7-4b72-9ea9-ddae593f4a5c" containerName="manager"
Apr 17 18:58:46.829923 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.829902 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.832210 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.832186 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-xkpbx\""
Apr 17 18:58:46.843381 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.843356 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"]
Apr 17 18:58:46.847913 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.847884 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.848025 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.847926 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.848025 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.847959 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9c067f1a-52b1-40a7-a39b-61a914c4a305-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.848152 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.848028 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9c067f1a-52b1-40a7-a39b-61a914c4a305-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.848152 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.848079 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9c067f1a-52b1-40a7-a39b-61a914c4a305-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.848152 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.848135 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.848312 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.848165 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.848312 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.848196 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8cqx\" (UniqueName: \"kubernetes.io/projected/9c067f1a-52b1-40a7-a39b-61a914c4a305-kube-api-access-v8cqx\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.848312 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.848225 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.948984 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.948950 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9c067f1a-52b1-40a7-a39b-61a914c4a305-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949154 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9c067f1a-52b1-40a7-a39b-61a914c4a305-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949154 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949128 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949269 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949169 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949269 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949209 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8cqx\" (UniqueName: \"kubernetes.io/projected/9c067f1a-52b1-40a7-a39b-61a914c4a305-kube-api-access-v8cqx\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949430 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949402 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949549 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949525 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949610 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949554 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949610 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949583 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9c067f1a-52b1-40a7-a39b-61a914c4a305-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949610 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949588 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949768 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949519 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949826 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949795 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9c067f1a-52b1-40a7-a39b-61a914c4a305-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949880 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949799 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.949880 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.949846 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.951451 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.951431 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9c067f1a-52b1-40a7-a39b-61a914c4a305-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.951556 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.951452 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9c067f1a-52b1-40a7-a39b-61a914c4a305-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.957614 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.957593 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9c067f1a-52b1-40a7-a39b-61a914c4a305-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:46.958124 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:46.958100 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8cqx\" (UniqueName: \"kubernetes.io/projected/9c067f1a-52b1-40a7-a39b-61a914c4a305-kube-api-access-v8cqx\") pod \"maas-default-gateway-openshift-default-58b6f876-ldr4c\" (UID: \"9c067f1a-52b1-40a7-a39b-61a914c4a305\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:47.141133 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:47.141047 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:47.274100 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:47.274070 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"]
Apr 17 18:58:47.275620 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:58:47.275590 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c067f1a_52b1_40a7_a39b_61a914c4a305.slice/crio-c9f003e451c2db53dab12d057a1fa8447049be79029c96e482b8cc054660be61 WatchSource:0}: Error finding container c9f003e451c2db53dab12d057a1fa8447049be79029c96e482b8cc054660be61: Status 404 returned error can't find the container with id c9f003e451c2db53dab12d057a1fa8447049be79029c96e482b8cc054660be61
Apr 17 18:58:47.278186 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:47.278149 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 17 18:58:47.278447 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:47.278235 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 17 18:58:47.278447 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:47.278270 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 17 18:58:48.221518 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:48.221480 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c" event={"ID":"9c067f1a-52b1-40a7-a39b-61a914c4a305","Type":"ContainerStarted","Data":"14f61da3e6945cb28332c1b7d94ddf4efb0c86f3e4cea3dae371854c14d33087"}
Apr 17 18:58:48.221518 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:48.221522 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c" event={"ID":"9c067f1a-52b1-40a7-a39b-61a914c4a305","Type":"ContainerStarted","Data":"c9f003e451c2db53dab12d057a1fa8447049be79029c96e482b8cc054660be61"}
Apr 17 18:58:48.240483 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:48.240416 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c" podStartSLOduration=2.240403509 podStartE2EDuration="2.240403509s" podCreationTimestamp="2026-04-17 18:58:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:58:48.237955622 +0000 UTC m=+567.511972914" watchObservedRunningTime="2026-04-17 18:58:48.240403509 +0000 UTC m=+567.514420801"
Apr 17 18:58:49.142158 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:49.142118 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:49.146912 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:49.146885 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:49.225592 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:49.225560 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:49.226550 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:49.226533 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-ldr4c"
Apr 17 18:58:51.169715 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.169680 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g6k"]
Apr 17 18:58:51.172644 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.172626 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k"
Apr 17 18:58:51.174794 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.174772 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-zjsrt\""
Apr 17 18:58:51.174905 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.174828 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 18:58:51.181139 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.181121 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fbc8f9c4-f45a-4aee-9338-8d42b1465507-config-file\") pod \"limitador-limitador-7d549b5b-b2g6k\" (UID: \"fbc8f9c4-f45a-4aee-9338-8d42b1465507\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k"
Apr 17 18:58:51.181258 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.181228 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm6q7\" (UniqueName: \"kubernetes.io/projected/fbc8f9c4-f45a-4aee-9338-8d42b1465507-kube-api-access-jm6q7\") pod \"limitador-limitador-7d549b5b-b2g6k\" (UID: \"fbc8f9c4-f45a-4aee-9338-8d42b1465507\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k"
Apr 17 18:58:51.181615 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.181594 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g6k"]
Apr 17 18:58:51.273952 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.273918 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g6k"]
Apr 17 18:58:51.281737 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.281695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jm6q7\" (UniqueName: \"kubernetes.io/projected/fbc8f9c4-f45a-4aee-9338-8d42b1465507-kube-api-access-jm6q7\") pod \"limitador-limitador-7d549b5b-b2g6k\" (UID: \"fbc8f9c4-f45a-4aee-9338-8d42b1465507\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k"
Apr 17 18:58:51.281899 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.281792 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fbc8f9c4-f45a-4aee-9338-8d42b1465507-config-file\") pod \"limitador-limitador-7d549b5b-b2g6k\" (UID: \"fbc8f9c4-f45a-4aee-9338-8d42b1465507\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k"
Apr 17 18:58:51.282437 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.282414 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fbc8f9c4-f45a-4aee-9338-8d42b1465507-config-file\") pod \"limitador-limitador-7d549b5b-b2g6k\" (UID: \"fbc8f9c4-f45a-4aee-9338-8d42b1465507\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k"
Apr 17 18:58:51.289123 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.289099 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm6q7\" (UniqueName: \"kubernetes.io/projected/fbc8f9c4-f45a-4aee-9338-8d42b1465507-kube-api-access-jm6q7\") pod \"limitador-limitador-7d549b5b-b2g6k\" (UID: \"fbc8f9c4-f45a-4aee-9338-8d42b1465507\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k"
Apr 17 18:58:51.484092 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.484012 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k"
Apr 17 18:58:51.605769 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.605746 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g6k"]
Apr 17 18:58:51.608046 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:58:51.608017 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbc8f9c4_f45a_4aee_9338_8d42b1465507.slice/crio-680c7c0de483379487f0e3a0b41a6edba39baffe798e56f7e1fb22cbfedae795 WatchSource:0}: Error finding container 680c7c0de483379487f0e3a0b41a6edba39baffe798e56f7e1fb22cbfedae795: Status 404 returned error can't find the container with id 680c7c0de483379487f0e3a0b41a6edba39baffe798e56f7e1fb22cbfedae795
Apr 17 18:58:51.963875 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.963845 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wglqq"]
Apr 17 18:58:51.967849 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.967828 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-wglqq" Apr 17 18:58:51.970186 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.970159 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-mr96b\"" Apr 17 18:58:51.972844 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.972816 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wglqq"] Apr 17 18:58:51.987728 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.987703 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv7cv\" (UniqueName: \"kubernetes.io/projected/b7906150-0a99-4302-bcba-0b00bb22dc24-kube-api-access-xv7cv\") pod \"authorino-f99f4b5cd-wglqq\" (UID: \"b7906150-0a99-4302-bcba-0b00bb22dc24\") " pod="kuadrant-system/authorino-f99f4b5cd-wglqq" Apr 17 18:58:51.997740 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:51.997718 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-pljxl"] Apr 17 18:58:52.001166 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:52.001149 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-pljxl" Apr 17 18:58:52.005407 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:52.005383 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-pljxl"] Apr 17 18:58:52.088899 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:52.088864 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4wnd\" (UniqueName: \"kubernetes.io/projected/3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b-kube-api-access-d4wnd\") pod \"authorino-7498df8756-pljxl\" (UID: \"3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b\") " pod="kuadrant-system/authorino-7498df8756-pljxl" Apr 17 18:58:52.089031 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:52.088919 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv7cv\" (UniqueName: \"kubernetes.io/projected/b7906150-0a99-4302-bcba-0b00bb22dc24-kube-api-access-xv7cv\") pod \"authorino-f99f4b5cd-wglqq\" (UID: \"b7906150-0a99-4302-bcba-0b00bb22dc24\") " pod="kuadrant-system/authorino-f99f4b5cd-wglqq" Apr 17 18:58:52.096709 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:52.096686 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv7cv\" (UniqueName: \"kubernetes.io/projected/b7906150-0a99-4302-bcba-0b00bb22dc24-kube-api-access-xv7cv\") pod \"authorino-f99f4b5cd-wglqq\" (UID: \"b7906150-0a99-4302-bcba-0b00bb22dc24\") " pod="kuadrant-system/authorino-f99f4b5cd-wglqq" Apr 17 18:58:52.189902 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:52.189871 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4wnd\" (UniqueName: \"kubernetes.io/projected/3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b-kube-api-access-d4wnd\") pod \"authorino-7498df8756-pljxl\" (UID: \"3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b\") " pod="kuadrant-system/authorino-7498df8756-pljxl" Apr 17 18:58:52.197349 
ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:52.197325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4wnd\" (UniqueName: \"kubernetes.io/projected/3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b-kube-api-access-d4wnd\") pod \"authorino-7498df8756-pljxl\" (UID: \"3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b\") " pod="kuadrant-system/authorino-7498df8756-pljxl" Apr 17 18:58:52.242181 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:52.242070 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k" event={"ID":"fbc8f9c4-f45a-4aee-9338-8d42b1465507","Type":"ContainerStarted","Data":"680c7c0de483379487f0e3a0b41a6edba39baffe798e56f7e1fb22cbfedae795"} Apr 17 18:58:52.279162 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:52.279134 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-wglqq" Apr 17 18:58:52.310965 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:52.310850 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-pljxl" Apr 17 18:58:52.467165 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:52.466039 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wglqq"] Apr 17 18:58:52.473860 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:58:52.473821 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7906150_0a99_4302_bcba_0b00bb22dc24.slice/crio-973d5969ee03812e26b2e6681d17007101f20fca380143b3dbf8735dc2cc46f7 WatchSource:0}: Error finding container 973d5969ee03812e26b2e6681d17007101f20fca380143b3dbf8735dc2cc46f7: Status 404 returned error can't find the container with id 973d5969ee03812e26b2e6681d17007101f20fca380143b3dbf8735dc2cc46f7 Apr 17 18:58:52.515063 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:52.515032 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-pljxl"] Apr 17 18:58:52.515195 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:58:52.515170 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a7a630a_f3ab_4089_8bd5_9bd8aa11a34b.slice/crio-a449b67f5153f49325026f622fccd84a9c26cf77a20c42e60090b5293d175d6d WatchSource:0}: Error finding container a449b67f5153f49325026f622fccd84a9c26cf77a20c42e60090b5293d175d6d: Status 404 returned error can't find the container with id a449b67f5153f49325026f622fccd84a9c26cf77a20c42e60090b5293d175d6d Apr 17 18:58:53.250326 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:53.250289 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-pljxl" event={"ID":"3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b","Type":"ContainerStarted","Data":"a449b67f5153f49325026f622fccd84a9c26cf77a20c42e60090b5293d175d6d"} Apr 17 18:58:53.253108 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:53.253080 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-wglqq" event={"ID":"b7906150-0a99-4302-bcba-0b00bb22dc24","Type":"ContainerStarted","Data":"973d5969ee03812e26b2e6681d17007101f20fca380143b3dbf8735dc2cc46f7"} Apr 17 18:58:56.266939 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:56.266899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k" event={"ID":"fbc8f9c4-f45a-4aee-9338-8d42b1465507","Type":"ContainerStarted","Data":"9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db"} Apr 17 18:58:56.267413 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:56.266996 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k" Apr 17 18:58:56.268346 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:56.268315 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-wglqq" event={"ID":"b7906150-0a99-4302-bcba-0b00bb22dc24","Type":"ContainerStarted","Data":"f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d"} Apr 17 18:58:56.269583 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:56.269532 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-pljxl" event={"ID":"3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b","Type":"ContainerStarted","Data":"82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073"} Apr 17 18:58:56.282041 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:56.281996 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k" podStartSLOduration=1.176840858 podStartE2EDuration="5.281985206s" podCreationTimestamp="2026-04-17 18:58:51 +0000 UTC" firstStartedPulling="2026-04-17 18:58:51.610236807 +0000 UTC m=+570.884254078" lastFinishedPulling="2026-04-17 18:58:55.715381151 +0000 UTC m=+574.989398426" 
observedRunningTime="2026-04-17 18:58:56.280422773 +0000 UTC m=+575.554440067" watchObservedRunningTime="2026-04-17 18:58:56.281985206 +0000 UTC m=+575.556002499" Apr 17 18:58:56.293079 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:56.293033 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-wglqq" podStartSLOduration=2.119501139 podStartE2EDuration="5.293017777s" podCreationTimestamp="2026-04-17 18:58:51 +0000 UTC" firstStartedPulling="2026-04-17 18:58:52.478773499 +0000 UTC m=+571.752790770" lastFinishedPulling="2026-04-17 18:58:55.652290137 +0000 UTC m=+574.926307408" observedRunningTime="2026-04-17 18:58:56.292493787 +0000 UTC m=+575.566511077" watchObservedRunningTime="2026-04-17 18:58:56.293017777 +0000 UTC m=+575.567035070" Apr 17 18:58:56.304827 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:56.304754 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-pljxl" podStartSLOduration=2.170351198 podStartE2EDuration="5.304720316s" podCreationTimestamp="2026-04-17 18:58:51 +0000 UTC" firstStartedPulling="2026-04-17 18:58:52.517919416 +0000 UTC m=+571.791936703" lastFinishedPulling="2026-04-17 18:58:55.65228855 +0000 UTC m=+574.926305821" observedRunningTime="2026-04-17 18:58:56.304166933 +0000 UTC m=+575.578184229" watchObservedRunningTime="2026-04-17 18:58:56.304720316 +0000 UTC m=+575.578737610" Apr 17 18:58:56.326050 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:56.326020 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wglqq"] Apr 17 18:58:58.276475 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:58.276422 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-wglqq" podUID="b7906150-0a99-4302-bcba-0b00bb22dc24" containerName="authorino" 
containerID="cri-o://f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d" gracePeriod=30 Apr 17 18:58:58.509368 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:58.509341 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-wglqq" Apr 17 18:58:58.539800 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:58.539738 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv7cv\" (UniqueName: \"kubernetes.io/projected/b7906150-0a99-4302-bcba-0b00bb22dc24-kube-api-access-xv7cv\") pod \"b7906150-0a99-4302-bcba-0b00bb22dc24\" (UID: \"b7906150-0a99-4302-bcba-0b00bb22dc24\") " Apr 17 18:58:58.541661 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:58.541638 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7906150-0a99-4302-bcba-0b00bb22dc24-kube-api-access-xv7cv" (OuterVolumeSpecName: "kube-api-access-xv7cv") pod "b7906150-0a99-4302-bcba-0b00bb22dc24" (UID: "b7906150-0a99-4302-bcba-0b00bb22dc24"). InnerVolumeSpecName "kube-api-access-xv7cv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:58:58.640447 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:58.640418 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xv7cv\" (UniqueName: \"kubernetes.io/projected/b7906150-0a99-4302-bcba-0b00bb22dc24-kube-api-access-xv7cv\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:58:59.281002 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:59.280964 2571 generic.go:358] "Generic (PLEG): container finished" podID="b7906150-0a99-4302-bcba-0b00bb22dc24" containerID="f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d" exitCode=0 Apr 17 18:58:59.281431 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:59.281012 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-wglqq" Apr 17 18:58:59.281431 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:59.281045 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-wglqq" event={"ID":"b7906150-0a99-4302-bcba-0b00bb22dc24","Type":"ContainerDied","Data":"f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d"} Apr 17 18:58:59.281431 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:59.281103 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-wglqq" event={"ID":"b7906150-0a99-4302-bcba-0b00bb22dc24","Type":"ContainerDied","Data":"973d5969ee03812e26b2e6681d17007101f20fca380143b3dbf8735dc2cc46f7"} Apr 17 18:58:59.281431 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:59.281122 2571 scope.go:117] "RemoveContainer" containerID="f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d" Apr 17 18:58:59.289993 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:59.289978 2571 scope.go:117] "RemoveContainer" containerID="f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d" Apr 17 18:58:59.290249 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:58:59.290225 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d\": container with ID starting with f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d not found: ID does not exist" containerID="f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d" Apr 17 18:58:59.290314 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:59.290255 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d"} err="failed to get container status \"f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d\": rpc error: code = 
NotFound desc = could not find container \"f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d\": container with ID starting with f97ba6db9861dd1449788671fa92e7536e6e8887d199b8c245e39592e7fcfa1d not found: ID does not exist" Apr 17 18:58:59.296576 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:59.296554 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wglqq"] Apr 17 18:58:59.298755 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:58:59.298737 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wglqq"] Apr 17 18:59:01.238148 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:01.238113 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7906150-0a99-4302-bcba-0b00bb22dc24" path="/var/lib/kubelet/pods/b7906150-0a99-4302-bcba-0b00bb22dc24/volumes" Apr 17 18:59:06.007894 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:06.007862 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g6k"] Apr 17 18:59:06.008237 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:06.008076 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k" podUID="fbc8f9c4-f45a-4aee-9338-8d42b1465507" containerName="limitador" containerID="cri-o://9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db" gracePeriod=30 Apr 17 18:59:06.008689 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:06.008678 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k" Apr 17 18:59:06.553962 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:06.553938 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k" Apr 17 18:59:06.604068 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:06.604001 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm6q7\" (UniqueName: \"kubernetes.io/projected/fbc8f9c4-f45a-4aee-9338-8d42b1465507-kube-api-access-jm6q7\") pod \"fbc8f9c4-f45a-4aee-9338-8d42b1465507\" (UID: \"fbc8f9c4-f45a-4aee-9338-8d42b1465507\") " Apr 17 18:59:06.604208 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:06.604076 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fbc8f9c4-f45a-4aee-9338-8d42b1465507-config-file\") pod \"fbc8f9c4-f45a-4aee-9338-8d42b1465507\" (UID: \"fbc8f9c4-f45a-4aee-9338-8d42b1465507\") " Apr 17 18:59:06.604415 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:06.604392 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc8f9c4-f45a-4aee-9338-8d42b1465507-config-file" (OuterVolumeSpecName: "config-file") pod "fbc8f9c4-f45a-4aee-9338-8d42b1465507" (UID: "fbc8f9c4-f45a-4aee-9338-8d42b1465507"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:59:06.606090 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:06.606062 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbc8f9c4-f45a-4aee-9338-8d42b1465507-kube-api-access-jm6q7" (OuterVolumeSpecName: "kube-api-access-jm6q7") pod "fbc8f9c4-f45a-4aee-9338-8d42b1465507" (UID: "fbc8f9c4-f45a-4aee-9338-8d42b1465507"). InnerVolumeSpecName "kube-api-access-jm6q7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:59:06.705417 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:06.705383 2571 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fbc8f9c4-f45a-4aee-9338-8d42b1465507-config-file\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:59:06.705417 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:06.705410 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jm6q7\" (UniqueName: \"kubernetes.io/projected/fbc8f9c4-f45a-4aee-9338-8d42b1465507-kube-api-access-jm6q7\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:59:07.162757 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.162718 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-cvvzj"] Apr 17 18:59:07.163132 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.163051 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7906150-0a99-4302-bcba-0b00bb22dc24" containerName="authorino" Apr 17 18:59:07.163132 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.163062 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7906150-0a99-4302-bcba-0b00bb22dc24" containerName="authorino" Apr 17 18:59:07.163132 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.163080 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbc8f9c4-f45a-4aee-9338-8d42b1465507" containerName="limitador" Apr 17 18:59:07.163132 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.163086 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc8f9c4-f45a-4aee-9338-8d42b1465507" containerName="limitador" Apr 17 18:59:07.163277 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.163140 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7906150-0a99-4302-bcba-0b00bb22dc24" containerName="authorino" Apr 17 18:59:07.163277 ip-10-0-132-192 
kubenswrapper[2571]: I0417 18:59:07.163153 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbc8f9c4-f45a-4aee-9338-8d42b1465507" containerName="limitador" Apr 17 18:59:07.166341 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.166318 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-cvvzj" Apr 17 18:59:07.168127 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.168104 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 17 18:59:07.168264 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.168142 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-4mb64\"" Apr 17 18:59:07.172334 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.172313 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-cvvzj"] Apr 17 18:59:07.208575 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.208544 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc6vb\" (UniqueName: \"kubernetes.io/projected/d98f31e4-a83d-4a7f-819a-0d9fb01a99a3-kube-api-access-nc6vb\") pod \"postgres-868db5846d-cvvzj\" (UID: \"d98f31e4-a83d-4a7f-819a-0d9fb01a99a3\") " pod="opendatahub/postgres-868db5846d-cvvzj" Apr 17 18:59:07.208699 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.208652 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d98f31e4-a83d-4a7f-819a-0d9fb01a99a3-data\") pod \"postgres-868db5846d-cvvzj\" (UID: \"d98f31e4-a83d-4a7f-819a-0d9fb01a99a3\") " pod="opendatahub/postgres-868db5846d-cvvzj" Apr 17 18:59:07.309029 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.309001 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/d98f31e4-a83d-4a7f-819a-0d9fb01a99a3-data\") pod \"postgres-868db5846d-cvvzj\" (UID: \"d98f31e4-a83d-4a7f-819a-0d9fb01a99a3\") " pod="opendatahub/postgres-868db5846d-cvvzj" Apr 17 18:59:07.309215 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.309055 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nc6vb\" (UniqueName: \"kubernetes.io/projected/d98f31e4-a83d-4a7f-819a-0d9fb01a99a3-kube-api-access-nc6vb\") pod \"postgres-868db5846d-cvvzj\" (UID: \"d98f31e4-a83d-4a7f-819a-0d9fb01a99a3\") " pod="opendatahub/postgres-868db5846d-cvvzj" Apr 17 18:59:07.309480 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.309439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d98f31e4-a83d-4a7f-819a-0d9fb01a99a3-data\") pod \"postgres-868db5846d-cvvzj\" (UID: \"d98f31e4-a83d-4a7f-819a-0d9fb01a99a3\") " pod="opendatahub/postgres-868db5846d-cvvzj" Apr 17 18:59:07.312070 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.312047 2571 generic.go:358] "Generic (PLEG): container finished" podID="fbc8f9c4-f45a-4aee-9338-8d42b1465507" containerID="9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db" exitCode=0 Apr 17 18:59:07.312179 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.312105 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k" Apr 17 18:59:07.312179 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.312130 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k" event={"ID":"fbc8f9c4-f45a-4aee-9338-8d42b1465507","Type":"ContainerDied","Data":"9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db"} Apr 17 18:59:07.312179 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.312167 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g6k" event={"ID":"fbc8f9c4-f45a-4aee-9338-8d42b1465507","Type":"ContainerDied","Data":"680c7c0de483379487f0e3a0b41a6edba39baffe798e56f7e1fb22cbfedae795"} Apr 17 18:59:07.312313 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.312186 2571 scope.go:117] "RemoveContainer" containerID="9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db" Apr 17 18:59:07.316688 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.316670 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc6vb\" (UniqueName: \"kubernetes.io/projected/d98f31e4-a83d-4a7f-819a-0d9fb01a99a3-kube-api-access-nc6vb\") pod \"postgres-868db5846d-cvvzj\" (UID: \"d98f31e4-a83d-4a7f-819a-0d9fb01a99a3\") " pod="opendatahub/postgres-868db5846d-cvvzj" Apr 17 18:59:07.326974 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.326952 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g6k"] Apr 17 18:59:07.328035 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.328021 2571 scope.go:117] "RemoveContainer" containerID="9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db" Apr 17 18:59:07.328300 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:59:07.328280 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db\": container with ID starting with 9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db not found: ID does not exist" containerID="9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db" Apr 17 18:59:07.328368 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.328310 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db"} err="failed to get container status \"9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db\": rpc error: code = NotFound desc = could not find container \"9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db\": container with ID starting with 9c310174b97dc4e734b47304419e68adf6f63627753338d32a4abc332348f5db not found: ID does not exist" Apr 17 18:59:07.331328 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.331309 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g6k"] Apr 17 18:59:07.479781 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.479704 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-cvvzj" Apr 17 18:59:07.804326 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:07.804305 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-cvvzj"] Apr 17 18:59:07.805976 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:59:07.805936 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd98f31e4_a83d_4a7f_819a_0d9fb01a99a3.slice/crio-7ca58256d3f95215f850f297e4a5fab9bd25a7d71edb2c0bd0da549f3418e66d WatchSource:0}: Error finding container 7ca58256d3f95215f850f297e4a5fab9bd25a7d71edb2c0bd0da549f3418e66d: Status 404 returned error can't find the container with id 7ca58256d3f95215f850f297e4a5fab9bd25a7d71edb2c0bd0da549f3418e66d Apr 17 18:59:08.317044 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:08.317012 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-cvvzj" event={"ID":"d98f31e4-a83d-4a7f-819a-0d9fb01a99a3","Type":"ContainerStarted","Data":"7ca58256d3f95215f850f297e4a5fab9bd25a7d71edb2c0bd0da549f3418e66d"} Apr 17 18:59:09.240001 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:09.239963 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbc8f9c4-f45a-4aee-9338-8d42b1465507" path="/var/lib/kubelet/pods/fbc8f9c4-f45a-4aee-9338-8d42b1465507/volumes" Apr 17 18:59:13.342711 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:13.342671 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-cvvzj" event={"ID":"d98f31e4-a83d-4a7f-819a-0d9fb01a99a3","Type":"ContainerStarted","Data":"edffa4e2a7a4cfe04c395ee3a0bc14a5ed05a46080757549728cc0a04e9da6e9"} Apr 17 18:59:13.343106 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:13.342824 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-cvvzj" Apr 17 18:59:13.358009 ip-10-0-132-192 
kubenswrapper[2571]: I0417 18:59:13.357961 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-cvvzj" podStartSLOduration=1.170365873 podStartE2EDuration="6.35794806s" podCreationTimestamp="2026-04-17 18:59:07 +0000 UTC" firstStartedPulling="2026-04-17 18:59:07.807290157 +0000 UTC m=+587.081307432" lastFinishedPulling="2026-04-17 18:59:12.994872336 +0000 UTC m=+592.268889619" observedRunningTime="2026-04-17 18:59:13.355701862 +0000 UTC m=+592.629719154" watchObservedRunningTime="2026-04-17 18:59:13.35794806 +0000 UTC m=+592.631965353" Apr 17 18:59:19.374978 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:19.374950 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-cvvzj" Apr 17 18:59:20.187893 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.187860 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7dccdbf55f-qlfjm"] Apr 17 18:59:20.194023 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.194004 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7dccdbf55f-qlfjm" Apr 17 18:59:20.196526 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.196501 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 18:59:20.196652 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.196557 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 18:59:20.196652 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.196606 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-8nwjw\"" Apr 17 18:59:20.201425 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.201399 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7dccdbf55f-qlfjm"] Apr 17 18:59:20.321233 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.321203 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2rkj\" (UniqueName: \"kubernetes.io/projected/0fd3fecf-019d-458a-91f7-e2cff9904e0d-kube-api-access-g2rkj\") pod \"maas-api-7dccdbf55f-qlfjm\" (UID: \"0fd3fecf-019d-458a-91f7-e2cff9904e0d\") " pod="opendatahub/maas-api-7dccdbf55f-qlfjm" Apr 17 18:59:20.321387 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.321306 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0fd3fecf-019d-458a-91f7-e2cff9904e0d-maas-api-tls\") pod \"maas-api-7dccdbf55f-qlfjm\" (UID: \"0fd3fecf-019d-458a-91f7-e2cff9904e0d\") " pod="opendatahub/maas-api-7dccdbf55f-qlfjm" Apr 17 18:59:20.422111 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.422077 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2rkj\" (UniqueName: 
\"kubernetes.io/projected/0fd3fecf-019d-458a-91f7-e2cff9904e0d-kube-api-access-g2rkj\") pod \"maas-api-7dccdbf55f-qlfjm\" (UID: \"0fd3fecf-019d-458a-91f7-e2cff9904e0d\") " pod="opendatahub/maas-api-7dccdbf55f-qlfjm" Apr 17 18:59:20.422529 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.422136 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0fd3fecf-019d-458a-91f7-e2cff9904e0d-maas-api-tls\") pod \"maas-api-7dccdbf55f-qlfjm\" (UID: \"0fd3fecf-019d-458a-91f7-e2cff9904e0d\") " pod="opendatahub/maas-api-7dccdbf55f-qlfjm" Apr 17 18:59:20.424559 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.424529 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0fd3fecf-019d-458a-91f7-e2cff9904e0d-maas-api-tls\") pod \"maas-api-7dccdbf55f-qlfjm\" (UID: \"0fd3fecf-019d-458a-91f7-e2cff9904e0d\") " pod="opendatahub/maas-api-7dccdbf55f-qlfjm" Apr 17 18:59:20.429098 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.429076 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2rkj\" (UniqueName: \"kubernetes.io/projected/0fd3fecf-019d-458a-91f7-e2cff9904e0d-kube-api-access-g2rkj\") pod \"maas-api-7dccdbf55f-qlfjm\" (UID: \"0fd3fecf-019d-458a-91f7-e2cff9904e0d\") " pod="opendatahub/maas-api-7dccdbf55f-qlfjm" Apr 17 18:59:20.506797 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.506769 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7dccdbf55f-qlfjm" Apr 17 18:59:20.627963 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:20.627940 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7dccdbf55f-qlfjm"] Apr 17 18:59:21.057244 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:21.057213 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7bd689b8-2dsbv"] Apr 17 18:59:21.062728 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:21.062712 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7bd689b8-2dsbv" Apr 17 18:59:21.067749 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:21.067722 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7bd689b8-2dsbv"] Apr 17 18:59:21.127910 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:21.127877 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2kbr\" (UniqueName: \"kubernetes.io/projected/7a1556c7-83af-4c76-adc1-75b2252706f3-kube-api-access-h2kbr\") pod \"maas-api-7bd689b8-2dsbv\" (UID: \"7a1556c7-83af-4c76-adc1-75b2252706f3\") " pod="opendatahub/maas-api-7bd689b8-2dsbv" Apr 17 18:59:21.128059 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:21.128011 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7a1556c7-83af-4c76-adc1-75b2252706f3-maas-api-tls\") pod \"maas-api-7bd689b8-2dsbv\" (UID: \"7a1556c7-83af-4c76-adc1-75b2252706f3\") " pod="opendatahub/maas-api-7bd689b8-2dsbv" Apr 17 18:59:21.229672 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:21.229635 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2kbr\" (UniqueName: \"kubernetes.io/projected/7a1556c7-83af-4c76-adc1-75b2252706f3-kube-api-access-h2kbr\") pod \"maas-api-7bd689b8-2dsbv\" (UID: 
\"7a1556c7-83af-4c76-adc1-75b2252706f3\") " pod="opendatahub/maas-api-7bd689b8-2dsbv" Apr 17 18:59:21.229835 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:21.229766 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7a1556c7-83af-4c76-adc1-75b2252706f3-maas-api-tls\") pod \"maas-api-7bd689b8-2dsbv\" (UID: \"7a1556c7-83af-4c76-adc1-75b2252706f3\") " pod="opendatahub/maas-api-7bd689b8-2dsbv" Apr 17 18:59:21.233894 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:21.233864 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7a1556c7-83af-4c76-adc1-75b2252706f3-maas-api-tls\") pod \"maas-api-7bd689b8-2dsbv\" (UID: \"7a1556c7-83af-4c76-adc1-75b2252706f3\") " pod="opendatahub/maas-api-7bd689b8-2dsbv" Apr 17 18:59:21.238622 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:21.238574 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2kbr\" (UniqueName: \"kubernetes.io/projected/7a1556c7-83af-4c76-adc1-75b2252706f3-kube-api-access-h2kbr\") pod \"maas-api-7bd689b8-2dsbv\" (UID: \"7a1556c7-83af-4c76-adc1-75b2252706f3\") " pod="opendatahub/maas-api-7bd689b8-2dsbv" Apr 17 18:59:21.374539 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:21.374364 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7bd689b8-2dsbv" Apr 17 18:59:21.375505 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:21.375319 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7dccdbf55f-qlfjm" event={"ID":"0fd3fecf-019d-458a-91f7-e2cff9904e0d","Type":"ContainerStarted","Data":"8a0124b2dd423613a4ab3865c90917c4497c9639c3c0500e7c8e187d2d0060e6"} Apr 17 18:59:21.570892 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:21.570842 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7bd689b8-2dsbv"] Apr 17 18:59:21.576208 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:59:21.576173 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a1556c7_83af_4c76_adc1_75b2252706f3.slice/crio-0c199f208eda92f1034a6c7f851c0e13018f7fab5cf3585cbcf60f67593ef8f1 WatchSource:0}: Error finding container 0c199f208eda92f1034a6c7f851c0e13018f7fab5cf3585cbcf60f67593ef8f1: Status 404 returned error can't find the container with id 0c199f208eda92f1034a6c7f851c0e13018f7fab5cf3585cbcf60f67593ef8f1 Apr 17 18:59:22.383988 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:22.383951 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7bd689b8-2dsbv" event={"ID":"7a1556c7-83af-4c76-adc1-75b2252706f3","Type":"ContainerStarted","Data":"0c199f208eda92f1034a6c7f851c0e13018f7fab5cf3585cbcf60f67593ef8f1"} Apr 17 18:59:23.719906 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:23.719875 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 18:59:24.392241 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:24.392209 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7bd689b8-2dsbv" event={"ID":"7a1556c7-83af-4c76-adc1-75b2252706f3","Type":"ContainerStarted","Data":"6b169f5cf4676ffb16ebc0d34cf79de59b3228347c0762966628726a79eb0a41"} 
Apr 17 18:59:24.392405 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:24.392320 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7bd689b8-2dsbv" Apr 17 18:59:24.393565 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:24.393541 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7dccdbf55f-qlfjm" event={"ID":"0fd3fecf-019d-458a-91f7-e2cff9904e0d","Type":"ContainerStarted","Data":"9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7"} Apr 17 18:59:24.393691 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:24.393654 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7dccdbf55f-qlfjm" Apr 17 18:59:24.409048 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:24.408999 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7bd689b8-2dsbv" podStartSLOduration=1.265303079 podStartE2EDuration="3.408988707s" podCreationTimestamp="2026-04-17 18:59:21 +0000 UTC" firstStartedPulling="2026-04-17 18:59:21.578408964 +0000 UTC m=+600.852426242" lastFinishedPulling="2026-04-17 18:59:23.722094596 +0000 UTC m=+602.996111870" observedRunningTime="2026-04-17 18:59:24.406119095 +0000 UTC m=+603.680136388" watchObservedRunningTime="2026-04-17 18:59:24.408988707 +0000 UTC m=+603.683006000" Apr 17 18:59:24.420173 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:24.420131 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7dccdbf55f-qlfjm" podStartSLOduration=1.343029233 podStartE2EDuration="4.420120583s" podCreationTimestamp="2026-04-17 18:59:20 +0000 UTC" firstStartedPulling="2026-04-17 18:59:20.638041566 +0000 UTC m=+599.912058842" lastFinishedPulling="2026-04-17 18:59:23.715132916 +0000 UTC m=+602.989150192" observedRunningTime="2026-04-17 18:59:24.418958749 +0000 UTC m=+603.692976041" watchObservedRunningTime="2026-04-17 18:59:24.420120583 +0000 UTC 
m=+603.694137876" Apr 17 18:59:25.056428 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.056396 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-pljxl"] Apr 17 18:59:25.056899 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.056680 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-pljxl" podUID="3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b" containerName="authorino" containerID="cri-o://82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073" gracePeriod=30 Apr 17 18:59:25.303377 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.303356 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-pljxl" Apr 17 18:59:25.343612 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.343542 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-ff598dcc8-xbh5x"] Apr 17 18:59:25.343890 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.343877 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b" containerName="authorino" Apr 17 18:59:25.343940 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.343891 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b" containerName="authorino" Apr 17 18:59:25.343972 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.343966 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b" containerName="authorino" Apr 17 18:59:25.388203 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.388182 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4wnd\" (UniqueName: \"kubernetes.io/projected/3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b-kube-api-access-d4wnd\") pod \"3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b\" (UID: 
\"3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b\") " Apr 17 18:59:25.390420 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.390396 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-ff598dcc8-xbh5x"] Apr 17 18:59:25.390574 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.390533 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-ff598dcc8-xbh5x" Apr 17 18:59:25.390710 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.390679 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b-kube-api-access-d4wnd" (OuterVolumeSpecName: "kube-api-access-d4wnd") pod "3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b" (UID: "3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b"). InnerVolumeSpecName "kube-api-access-d4wnd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:59:25.392695 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.392673 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 18:59:25.398760 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.398738 2571 generic.go:358] "Generic (PLEG): container finished" podID="3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b" containerID="82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073" exitCode=0 Apr 17 18:59:25.398840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.398778 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-pljxl" Apr 17 18:59:25.398840 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.398819 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-pljxl" event={"ID":"3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b","Type":"ContainerDied","Data":"82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073"} Apr 17 18:59:25.398921 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.398856 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-pljxl" event={"ID":"3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b","Type":"ContainerDied","Data":"a449b67f5153f49325026f622fccd84a9c26cf77a20c42e60090b5293d175d6d"} Apr 17 18:59:25.398921 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.398879 2571 scope.go:117] "RemoveContainer" containerID="82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073" Apr 17 18:59:25.409407 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.409389 2571 scope.go:117] "RemoveContainer" containerID="82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073" Apr 17 18:59:25.409749 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:59:25.409728 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073\": container with ID starting with 82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073 not found: ID does not exist" containerID="82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073" Apr 17 18:59:25.409835 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.409754 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073"} err="failed to get container status \"82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073\": rpc error: code = 
NotFound desc = could not find container \"82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073\": container with ID starting with 82d28b1c861faedb6895109c0b43f1b54d0d82771b4fe0daa77189c0f4b83073 not found: ID does not exist" Apr 17 18:59:25.419534 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.419512 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-pljxl"] Apr 17 18:59:25.422834 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.422812 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-pljxl"] Apr 17 18:59:25.489542 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.489505 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrtp9\" (UniqueName: \"kubernetes.io/projected/66d519dd-b5e0-4df4-8672-0f14a3d00b08-kube-api-access-hrtp9\") pod \"authorino-ff598dcc8-xbh5x\" (UID: \"66d519dd-b5e0-4df4-8672-0f14a3d00b08\") " pod="kuadrant-system/authorino-ff598dcc8-xbh5x" Apr 17 18:59:25.489745 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.489725 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/66d519dd-b5e0-4df4-8672-0f14a3d00b08-tls-cert\") pod \"authorino-ff598dcc8-xbh5x\" (UID: \"66d519dd-b5e0-4df4-8672-0f14a3d00b08\") " pod="kuadrant-system/authorino-ff598dcc8-xbh5x" Apr 17 18:59:25.489819 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.489791 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4wnd\" (UniqueName: \"kubernetes.io/projected/3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b-kube-api-access-d4wnd\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:59:25.590308 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.590277 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/66d519dd-b5e0-4df4-8672-0f14a3d00b08-tls-cert\") pod \"authorino-ff598dcc8-xbh5x\" (UID: \"66d519dd-b5e0-4df4-8672-0f14a3d00b08\") " pod="kuadrant-system/authorino-ff598dcc8-xbh5x" Apr 17 18:59:25.590479 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.590320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrtp9\" (UniqueName: \"kubernetes.io/projected/66d519dd-b5e0-4df4-8672-0f14a3d00b08-kube-api-access-hrtp9\") pod \"authorino-ff598dcc8-xbh5x\" (UID: \"66d519dd-b5e0-4df4-8672-0f14a3d00b08\") " pod="kuadrant-system/authorino-ff598dcc8-xbh5x" Apr 17 18:59:25.592700 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.592680 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/66d519dd-b5e0-4df4-8672-0f14a3d00b08-tls-cert\") pod \"authorino-ff598dcc8-xbh5x\" (UID: \"66d519dd-b5e0-4df4-8672-0f14a3d00b08\") " pod="kuadrant-system/authorino-ff598dcc8-xbh5x" Apr 17 18:59:25.597289 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.597231 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrtp9\" (UniqueName: \"kubernetes.io/projected/66d519dd-b5e0-4df4-8672-0f14a3d00b08-kube-api-access-hrtp9\") pod \"authorino-ff598dcc8-xbh5x\" (UID: \"66d519dd-b5e0-4df4-8672-0f14a3d00b08\") " pod="kuadrant-system/authorino-ff598dcc8-xbh5x" Apr 17 18:59:25.707551 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.707527 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-ff598dcc8-xbh5x" Apr 17 18:59:25.824186 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:25.824160 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-ff598dcc8-xbh5x"] Apr 17 18:59:25.826015 ip-10-0-132-192 kubenswrapper[2571]: W0417 18:59:25.825976 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66d519dd_b5e0_4df4_8672_0f14a3d00b08.slice/crio-98c07168606a3012a45685f6341e7065c69be9fcca879d17fef3f4442efce597 WatchSource:0}: Error finding container 98c07168606a3012a45685f6341e7065c69be9fcca879d17fef3f4442efce597: Status 404 returned error can't find the container with id 98c07168606a3012a45685f6341e7065c69be9fcca879d17fef3f4442efce597 Apr 17 18:59:26.405845 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:26.405811 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-ff598dcc8-xbh5x" event={"ID":"66d519dd-b5e0-4df4-8672-0f14a3d00b08","Type":"ContainerStarted","Data":"98c07168606a3012a45685f6341e7065c69be9fcca879d17fef3f4442efce597"} Apr 17 18:59:27.238091 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:27.238058 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b" path="/var/lib/kubelet/pods/3a7a630a-f3ab-4089-8bd5-9bd8aa11a34b/volumes" Apr 17 18:59:27.411010 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:27.410970 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-ff598dcc8-xbh5x" event={"ID":"66d519dd-b5e0-4df4-8672-0f14a3d00b08","Type":"ContainerStarted","Data":"3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c"} Apr 17 18:59:27.427917 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:27.427872 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-ff598dcc8-xbh5x" podStartSLOduration=1.937501347 
podStartE2EDuration="2.427858975s" podCreationTimestamp="2026-04-17 18:59:25 +0000 UTC" firstStartedPulling="2026-04-17 18:59:25.82736843 +0000 UTC m=+605.101385701" lastFinishedPulling="2026-04-17 18:59:26.317726055 +0000 UTC m=+605.591743329" observedRunningTime="2026-04-17 18:59:27.424615241 +0000 UTC m=+606.698632533" watchObservedRunningTime="2026-04-17 18:59:27.427858975 +0000 UTC m=+606.701876268" Apr 17 18:59:30.403870 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:30.403841 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7dccdbf55f-qlfjm" Apr 17 18:59:30.404356 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:30.404337 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7bd689b8-2dsbv" Apr 17 18:59:30.461090 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:30.461062 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7dccdbf55f-qlfjm"] Apr 17 18:59:30.461319 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:30.461272 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-7dccdbf55f-qlfjm" podUID="0fd3fecf-019d-458a-91f7-e2cff9904e0d" containerName="maas-api" containerID="cri-o://9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7" gracePeriod=30 Apr 17 18:59:30.703567 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:30.703543 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7dccdbf55f-qlfjm" Apr 17 18:59:30.841064 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:30.841035 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0fd3fecf-019d-458a-91f7-e2cff9904e0d-maas-api-tls\") pod \"0fd3fecf-019d-458a-91f7-e2cff9904e0d\" (UID: \"0fd3fecf-019d-458a-91f7-e2cff9904e0d\") " Apr 17 18:59:30.841223 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:30.841114 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2rkj\" (UniqueName: \"kubernetes.io/projected/0fd3fecf-019d-458a-91f7-e2cff9904e0d-kube-api-access-g2rkj\") pod \"0fd3fecf-019d-458a-91f7-e2cff9904e0d\" (UID: \"0fd3fecf-019d-458a-91f7-e2cff9904e0d\") " Apr 17 18:59:30.843211 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:30.843182 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd3fecf-019d-458a-91f7-e2cff9904e0d-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "0fd3fecf-019d-458a-91f7-e2cff9904e0d" (UID: "0fd3fecf-019d-458a-91f7-e2cff9904e0d"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:59:30.843311 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:30.843254 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd3fecf-019d-458a-91f7-e2cff9904e0d-kube-api-access-g2rkj" (OuterVolumeSpecName: "kube-api-access-g2rkj") pod "0fd3fecf-019d-458a-91f7-e2cff9904e0d" (UID: "0fd3fecf-019d-458a-91f7-e2cff9904e0d"). InnerVolumeSpecName "kube-api-access-g2rkj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:59:30.942541 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:30.942479 2571 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0fd3fecf-019d-458a-91f7-e2cff9904e0d-maas-api-tls\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:59:30.942541 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:30.942508 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g2rkj\" (UniqueName: \"kubernetes.io/projected/0fd3fecf-019d-458a-91f7-e2cff9904e0d-kube-api-access-g2rkj\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 18:59:31.426617 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:31.426582 2571 generic.go:358] "Generic (PLEG): container finished" podID="0fd3fecf-019d-458a-91f7-e2cff9904e0d" containerID="9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7" exitCode=0 Apr 17 18:59:31.427014 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:31.426649 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7dccdbf55f-qlfjm" Apr 17 18:59:31.427014 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:31.426662 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7dccdbf55f-qlfjm" event={"ID":"0fd3fecf-019d-458a-91f7-e2cff9904e0d","Type":"ContainerDied","Data":"9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7"} Apr 17 18:59:31.427014 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:31.426705 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7dccdbf55f-qlfjm" event={"ID":"0fd3fecf-019d-458a-91f7-e2cff9904e0d","Type":"ContainerDied","Data":"8a0124b2dd423613a4ab3865c90917c4497c9639c3c0500e7c8e187d2d0060e6"} Apr 17 18:59:31.427014 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:31.426726 2571 scope.go:117] "RemoveContainer" containerID="9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7" Apr 17 18:59:31.438324 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:31.438306 2571 scope.go:117] "RemoveContainer" containerID="9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7" Apr 17 18:59:31.438588 ip-10-0-132-192 kubenswrapper[2571]: E0417 18:59:31.438567 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7\": container with ID starting with 9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7 not found: ID does not exist" containerID="9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7" Apr 17 18:59:31.438643 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:31.438598 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7"} err="failed to get container status \"9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7\": rpc error: code = NotFound desc = 
could not find container \"9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7\": container with ID starting with 9e00efb5f06eb5e53c6f4af9f6bb575e549dcb4e27447251ac1f268ada90ebd7 not found: ID does not exist" Apr 17 18:59:31.442066 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:31.442045 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7dccdbf55f-qlfjm"] Apr 17 18:59:31.445662 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:31.445643 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-7dccdbf55f-qlfjm"] Apr 17 18:59:33.238538 ip-10-0-132-192 kubenswrapper[2571]: I0417 18:59:33.238441 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd3fecf-019d-458a-91f7-e2cff9904e0d" path="/var/lib/kubelet/pods/0fd3fecf-019d-458a-91f7-e2cff9904e0d/volumes" Apr 17 19:00:09.915472 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:09.915421 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz"] Apr 17 19:00:09.916030 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:09.915948 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fd3fecf-019d-458a-91f7-e2cff9904e0d" containerName="maas-api" Apr 17 19:00:09.916030 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:09.915967 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd3fecf-019d-458a-91f7-e2cff9904e0d" containerName="maas-api" Apr 17 19:00:09.916148 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:09.916067 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fd3fecf-019d-458a-91f7-e2cff9904e0d" containerName="maas-api" Apr 17 19:00:09.918263 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:09.918242 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:09.926338 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:09.920735 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 19:00:09.927525 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:09.927504 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 19:00:09.927654 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:09.927645 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-xc4r9\"" Apr 17 19:00:09.927865 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:09.927837 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 17 19:00:09.931895 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:09.931870 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz"] Apr 17 19:00:10.068673 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.068641 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/03d7e0c5-238a-4f28-baa9-d78291c936f7-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.068673 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.068677 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/03d7e0c5-238a-4f28-baa9-d78291c936f7-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.068878 
ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.068699 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03d7e0c5-238a-4f28-baa9-d78291c936f7-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.068878 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.068787 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03d7e0c5-238a-4f28-baa9-d78291c936f7-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.068878 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.068841 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hj25\" (UniqueName: \"kubernetes.io/projected/03d7e0c5-238a-4f28-baa9-d78291c936f7-kube-api-access-5hj25\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.068988 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.068908 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d7e0c5-238a-4f28-baa9-d78291c936f7-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.170306 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.170217 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d7e0c5-238a-4f28-baa9-d78291c936f7-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.170306 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.170263 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/03d7e0c5-238a-4f28-baa9-d78291c936f7-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.170306 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.170291 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/03d7e0c5-238a-4f28-baa9-d78291c936f7-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.170614 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.170321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03d7e0c5-238a-4f28-baa9-d78291c936f7-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.170614 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.170369 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03d7e0c5-238a-4f28-baa9-d78291c936f7-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.170614 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.170423 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hj25\" (UniqueName: \"kubernetes.io/projected/03d7e0c5-238a-4f28-baa9-d78291c936f7-kube-api-access-5hj25\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.170765 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.170662 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d7e0c5-238a-4f28-baa9-d78291c936f7-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.170765 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.170720 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03d7e0c5-238a-4f28-baa9-d78291c936f7-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.170844 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.170763 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/03d7e0c5-238a-4f28-baa9-d78291c936f7-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.172498 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.172475 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/03d7e0c5-238a-4f28-baa9-d78291c936f7-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.172740 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.172723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03d7e0c5-238a-4f28-baa9-d78291c936f7-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.177259 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.177240 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hj25\" (UniqueName: \"kubernetes.io/projected/03d7e0c5-238a-4f28-baa9-d78291c936f7-kube-api-access-5hj25\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz\" (UID: \"03d7e0c5-238a-4f28-baa9-d78291c936f7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.236068 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.236033 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:10.360646 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.360619 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz"] Apr 17 19:00:10.362526 ip-10-0-132-192 kubenswrapper[2571]: W0417 19:00:10.362501 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03d7e0c5_238a_4f28_baa9_d78291c936f7.slice/crio-9e5697232f6e8a68065bfe63f7ffa0660ac2f34c38986cdbbbdbf38562471a9e WatchSource:0}: Error finding container 9e5697232f6e8a68065bfe63f7ffa0660ac2f34c38986cdbbbdbf38562471a9e: Status 404 returned error can't find the container with id 9e5697232f6e8a68065bfe63f7ffa0660ac2f34c38986cdbbbdbf38562471a9e Apr 17 19:00:10.364312 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.364293 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 19:00:10.577480 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:10.577422 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" event={"ID":"03d7e0c5-238a-4f28-baa9-d78291c936f7","Type":"ContainerStarted","Data":"9e5697232f6e8a68065bfe63f7ffa0660ac2f34c38986cdbbbdbf38562471a9e"} Apr 17 19:00:16.612365 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:16.612320 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" event={"ID":"03d7e0c5-238a-4f28-baa9-d78291c936f7","Type":"ContainerStarted","Data":"7bca71c425cfcc40ba250853d10729215855b3cf0aa21a9525f8feb4c36d192c"} Apr 17 19:00:21.634819 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:21.634783 2571 generic.go:358] "Generic (PLEG): container finished" podID="03d7e0c5-238a-4f28-baa9-d78291c936f7" containerID="7bca71c425cfcc40ba250853d10729215855b3cf0aa21a9525f8feb4c36d192c" exitCode=0 Apr 17 
19:00:21.635188 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:21.634859 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" event={"ID":"03d7e0c5-238a-4f28-baa9-d78291c936f7","Type":"ContainerDied","Data":"7bca71c425cfcc40ba250853d10729215855b3cf0aa21a9525f8feb4c36d192c"} Apr 17 19:00:23.407925 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.407889 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb"] Apr 17 19:00:23.410381 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.410364 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.412343 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.412323 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 17 19:00:23.421503 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.421479 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb"] Apr 17 19:00:23.489303 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.489277 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqxkp\" (UniqueName: \"kubernetes.io/projected/e9e59b59-c087-43af-9cd5-880aa3027e37-kube-api-access-xqxkp\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.489423 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.489330 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9e59b59-c087-43af-9cd5-880aa3027e37-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: 
\"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.489423 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.489383 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e9e59b59-c087-43af-9cd5-880aa3027e37-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.489423 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.489411 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e59b59-c087-43af-9cd5-880aa3027e37-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.489558 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.489492 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9e59b59-c087-43af-9cd5-880aa3027e37-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.489558 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.489554 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e9e59b59-c087-43af-9cd5-880aa3027e37-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.590548 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.590517 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9e59b59-c087-43af-9cd5-880aa3027e37-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.590675 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.590558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e9e59b59-c087-43af-9cd5-880aa3027e37-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.590675 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.590580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e59b59-c087-43af-9cd5-880aa3027e37-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.590675 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.590623 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9e59b59-c087-43af-9cd5-880aa3027e37-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.590675 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.590670 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e9e59b59-c087-43af-9cd5-880aa3027e37-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: 
\"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.590897 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.590712 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqxkp\" (UniqueName: \"kubernetes.io/projected/e9e59b59-c087-43af-9cd5-880aa3027e37-kube-api-access-xqxkp\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.590976 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.590953 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e9e59b59-c087-43af-9cd5-880aa3027e37-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.591048 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.590974 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9e59b59-c087-43af-9cd5-880aa3027e37-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.591103 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.591060 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e59b59-c087-43af-9cd5-880aa3027e37-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.592867 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.592844 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e9e59b59-c087-43af-9cd5-880aa3027e37-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.593154 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.593133 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9e59b59-c087-43af-9cd5-880aa3027e37-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.598218 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.598195 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqxkp\" (UniqueName: \"kubernetes.io/projected/e9e59b59-c087-43af-9cd5-880aa3027e37-kube-api-access-xqxkp\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gbdcb\" (UID: \"e9e59b59-c087-43af-9cd5-880aa3027e37\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.721561 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.721492 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:23.848554 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:23.848530 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb"] Apr 17 19:00:23.851867 ip-10-0-132-192 kubenswrapper[2571]: W0417 19:00:23.851832 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9e59b59_c087_43af_9cd5_880aa3027e37.slice/crio-c89ac7043427a822f2619bd5d6d64b35530dcdb95fcd3a175fe5a120ec7aa0b9 WatchSource:0}: Error finding container c89ac7043427a822f2619bd5d6d64b35530dcdb95fcd3a175fe5a120ec7aa0b9: Status 404 returned error can't find the container with id c89ac7043427a822f2619bd5d6d64b35530dcdb95fcd3a175fe5a120ec7aa0b9 Apr 17 19:00:24.649071 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:24.649036 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" event={"ID":"e9e59b59-c087-43af-9cd5-880aa3027e37","Type":"ContainerStarted","Data":"4ebf4e596d92e1793d31004ab443c0c57a48ea2dad9a3887a45461850c2d4307"} Apr 17 19:00:24.649071 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:24.649071 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" event={"ID":"e9e59b59-c087-43af-9cd5-880aa3027e37","Type":"ContainerStarted","Data":"c89ac7043427a822f2619bd5d6d64b35530dcdb95fcd3a175fe5a120ec7aa0b9"} Apr 17 19:00:26.660147 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:26.660111 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" event={"ID":"03d7e0c5-238a-4f28-baa9-d78291c936f7","Type":"ContainerStarted","Data":"c23d87e5c3d56e14200308d3d68eba3eccf142f5f14c1a0d42016fd9094b76af"} Apr 17 19:00:26.660657 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:26.660379 2571 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:26.677026 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:26.676945 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" podStartSLOduration=1.613927777 podStartE2EDuration="17.676933873s" podCreationTimestamp="2026-04-17 19:00:09 +0000 UTC" firstStartedPulling="2026-04-17 19:00:10.364430028 +0000 UTC m=+649.638447299" lastFinishedPulling="2026-04-17 19:00:26.42743612 +0000 UTC m=+665.701453395" observedRunningTime="2026-04-17 19:00:26.675320217 +0000 UTC m=+665.949337528" watchObservedRunningTime="2026-04-17 19:00:26.676933873 +0000 UTC m=+665.950951165" Apr 17 19:00:29.677585 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:29.677491 2571 generic.go:358] "Generic (PLEG): container finished" podID="e9e59b59-c087-43af-9cd5-880aa3027e37" containerID="4ebf4e596d92e1793d31004ab443c0c57a48ea2dad9a3887a45461850c2d4307" exitCode=0 Apr 17 19:00:29.677585 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:29.677549 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" event={"ID":"e9e59b59-c087-43af-9cd5-880aa3027e37","Type":"ContainerDied","Data":"4ebf4e596d92e1793d31004ab443c0c57a48ea2dad9a3887a45461850c2d4307"} Apr 17 19:00:31.116341 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.116267 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn"] Apr 17 19:00:31.123792 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.123770 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.126224 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.126200 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 17 19:00:31.127348 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.127322 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn"] Apr 17 19:00:31.262139 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.262111 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d0fba779-814b-477f-8f9a-e560adcb59e2-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.262139 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.262141 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d0fba779-814b-477f-8f9a-e560adcb59e2-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.262343 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.262168 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d0fba779-814b-477f-8f9a-e560adcb59e2-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 
19:00:31.262343 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.262267 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0fba779-814b-477f-8f9a-e560adcb59e2-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.262343 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.262309 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb7gx\" (UniqueName: \"kubernetes.io/projected/d0fba779-814b-477f-8f9a-e560adcb59e2-kube-api-access-mb7gx\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.262448 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.262369 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d0fba779-814b-477f-8f9a-e560adcb59e2-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.363440 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.363401 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d0fba779-814b-477f-8f9a-e560adcb59e2-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.363642 ip-10-0-132-192 kubenswrapper[2571]: I0417 
19:00:31.363498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d0fba779-814b-477f-8f9a-e560adcb59e2-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.363642 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.363526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d0fba779-814b-477f-8f9a-e560adcb59e2-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.363642 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.363575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d0fba779-814b-477f-8f9a-e560adcb59e2-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.363642 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.363625 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0fba779-814b-477f-8f9a-e560adcb59e2-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.363858 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.363669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mb7gx\" 
(UniqueName: \"kubernetes.io/projected/d0fba779-814b-477f-8f9a-e560adcb59e2-kube-api-access-mb7gx\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.363948 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.363921 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d0fba779-814b-477f-8f9a-e560adcb59e2-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.364026 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.363968 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0fba779-814b-477f-8f9a-e560adcb59e2-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.364026 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.363984 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d0fba779-814b-477f-8f9a-e560adcb59e2-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.365872 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.365849 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d0fba779-814b-477f-8f9a-e560adcb59e2-dshm\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.366157 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.366137 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d0fba779-814b-477f-8f9a-e560adcb59e2-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.370867 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.370808 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb7gx\" (UniqueName: \"kubernetes.io/projected/d0fba779-814b-477f-8f9a-e560adcb59e2-kube-api-access-mb7gx\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn\" (UID: \"d0fba779-814b-477f-8f9a-e560adcb59e2\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.435060 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.435027 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:31.771159 ip-10-0-132-192 kubenswrapper[2571]: W0417 19:00:31.767815 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0fba779_814b_477f_8f9a_e560adcb59e2.slice/crio-fa74881e455753bd0e3c7965f747c87b8842b3200f3e14b2a8861c32e6b9e7b5 WatchSource:0}: Error finding container fa74881e455753bd0e3c7965f747c87b8842b3200f3e14b2a8861c32e6b9e7b5: Status 404 returned error can't find the container with id fa74881e455753bd0e3c7965f747c87b8842b3200f3e14b2a8861c32e6b9e7b5 Apr 17 19:00:31.771159 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:31.769259 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn"] Apr 17 19:00:32.691370 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:32.691328 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" event={"ID":"e9e59b59-c087-43af-9cd5-880aa3027e37","Type":"ContainerStarted","Data":"4acaf5fbbe1ec246232c665d06265ce5990c966e4d86d1b057187f583d4c18e6"} Apr 17 19:00:32.691872 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:32.691578 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:32.692767 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:32.692739 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" event={"ID":"d0fba779-814b-477f-8f9a-e560adcb59e2","Type":"ContainerStarted","Data":"ccb8be8f8d0ae4eaee2592df1853ab7d6cc57a31530addae82a509c432f6791d"} Apr 17 19:00:32.692767 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:32.692767 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" event={"ID":"d0fba779-814b-477f-8f9a-e560adcb59e2","Type":"ContainerStarted","Data":"fa74881e455753bd0e3c7965f747c87b8842b3200f3e14b2a8861c32e6b9e7b5"} Apr 17 19:00:32.711342 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:32.711300 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" podStartSLOduration=7.662841806 podStartE2EDuration="9.711287815s" podCreationTimestamp="2026-04-17 19:00:23 +0000 UTC" firstStartedPulling="2026-04-17 19:00:29.678164923 +0000 UTC m=+668.952182193" lastFinishedPulling="2026-04-17 19:00:31.726610927 +0000 UTC m=+671.000628202" observedRunningTime="2026-04-17 19:00:32.708125201 +0000 UTC m=+671.982142497" watchObservedRunningTime="2026-04-17 19:00:32.711287815 +0000 UTC m=+671.985305158" Apr 17 19:00:37.686552 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:37.686520 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz" Apr 17 19:00:37.722071 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:37.722038 2571 generic.go:358] "Generic (PLEG): container finished" podID="d0fba779-814b-477f-8f9a-e560adcb59e2" containerID="ccb8be8f8d0ae4eaee2592df1853ab7d6cc57a31530addae82a509c432f6791d" exitCode=0 Apr 17 19:00:37.722243 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:37.722104 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" event={"ID":"d0fba779-814b-477f-8f9a-e560adcb59e2","Type":"ContainerDied","Data":"ccb8be8f8d0ae4eaee2592df1853ab7d6cc57a31530addae82a509c432f6791d"} Apr 17 19:00:39.731422 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:39.731386 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" 
event={"ID":"d0fba779-814b-477f-8f9a-e560adcb59e2","Type":"ContainerStarted","Data":"2b8d9c9747fd354165470b7c212573db0a71311b138a3665bb3e7724642a751f"} Apr 17 19:00:39.731814 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:39.731630 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:00:39.749243 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:39.749199 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" podStartSLOduration=7.644473803 podStartE2EDuration="8.749186167s" podCreationTimestamp="2026-04-17 19:00:31 +0000 UTC" firstStartedPulling="2026-04-17 19:00:37.722849934 +0000 UTC m=+676.996867208" lastFinishedPulling="2026-04-17 19:00:38.827562295 +0000 UTC m=+678.101579572" observedRunningTime="2026-04-17 19:00:39.74685512 +0000 UTC m=+679.020872424" watchObservedRunningTime="2026-04-17 19:00:39.749186167 +0000 UTC m=+679.023203457" Apr 17 19:00:43.709528 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:43.709495 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gbdcb" Apr 17 19:00:50.748162 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:00:50.748131 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn" Apr 17 19:01:16.431708 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.431619 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-ff598dcc8-xbh5x"] Apr 17 19:01:16.432248 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.431877 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-ff598dcc8-xbh5x" podUID="66d519dd-b5e0-4df4-8672-0f14a3d00b08" containerName="authorino" 
containerID="cri-o://3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c" gracePeriod=30 Apr 17 19:01:16.676494 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.676472 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-ff598dcc8-xbh5x" Apr 17 19:01:16.750220 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.750144 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrtp9\" (UniqueName: \"kubernetes.io/projected/66d519dd-b5e0-4df4-8672-0f14a3d00b08-kube-api-access-hrtp9\") pod \"66d519dd-b5e0-4df4-8672-0f14a3d00b08\" (UID: \"66d519dd-b5e0-4df4-8672-0f14a3d00b08\") " Apr 17 19:01:16.750220 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.750175 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/66d519dd-b5e0-4df4-8672-0f14a3d00b08-tls-cert\") pod \"66d519dd-b5e0-4df4-8672-0f14a3d00b08\" (UID: \"66d519dd-b5e0-4df4-8672-0f14a3d00b08\") " Apr 17 19:01:16.752257 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.752226 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d519dd-b5e0-4df4-8672-0f14a3d00b08-kube-api-access-hrtp9" (OuterVolumeSpecName: "kube-api-access-hrtp9") pod "66d519dd-b5e0-4df4-8672-0f14a3d00b08" (UID: "66d519dd-b5e0-4df4-8672-0f14a3d00b08"). InnerVolumeSpecName "kube-api-access-hrtp9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 19:01:16.759876 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.759853 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d519dd-b5e0-4df4-8672-0f14a3d00b08-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "66d519dd-b5e0-4df4-8672-0f14a3d00b08" (UID: "66d519dd-b5e0-4df4-8672-0f14a3d00b08"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 19:01:16.851002 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.850978 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hrtp9\" (UniqueName: \"kubernetes.io/projected/66d519dd-b5e0-4df4-8672-0f14a3d00b08-kube-api-access-hrtp9\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 19:01:16.851002 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.851004 2571 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/66d519dd-b5e0-4df4-8672-0f14a3d00b08-tls-cert\") on node \"ip-10-0-132-192.ec2.internal\" DevicePath \"\"" Apr 17 19:01:16.881116 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.881088 2571 generic.go:358] "Generic (PLEG): container finished" podID="66d519dd-b5e0-4df4-8672-0f14a3d00b08" containerID="3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c" exitCode=0 Apr 17 19:01:16.881225 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.881139 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-ff598dcc8-xbh5x" Apr 17 19:01:16.881225 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.881181 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-ff598dcc8-xbh5x" event={"ID":"66d519dd-b5e0-4df4-8672-0f14a3d00b08","Type":"ContainerDied","Data":"3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c"} Apr 17 19:01:16.881302 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.881224 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-ff598dcc8-xbh5x" event={"ID":"66d519dd-b5e0-4df4-8672-0f14a3d00b08","Type":"ContainerDied","Data":"98c07168606a3012a45685f6341e7065c69be9fcca879d17fef3f4442efce597"} Apr 17 19:01:16.881302 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.881242 2571 scope.go:117] "RemoveContainer" containerID="3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c" Apr 17 19:01:16.890858 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.890844 2571 scope.go:117] "RemoveContainer" containerID="3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c" Apr 17 19:01:16.891077 ip-10-0-132-192 kubenswrapper[2571]: E0417 19:01:16.891060 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c\": container with ID starting with 3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c not found: ID does not exist" containerID="3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c" Apr 17 19:01:16.891127 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.891085 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c"} err="failed to get container status \"3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c\": rpc error: code = 
NotFound desc = could not find container \"3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c\": container with ID starting with 3ab3748a1100e3e958aab48eab39bbe6eb3e9a32164cf1f697eb74c38693ae8c not found: ID does not exist" Apr 17 19:01:16.901910 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.901885 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-ff598dcc8-xbh5x"] Apr 17 19:01:16.904390 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:16.904366 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-ff598dcc8-xbh5x"] Apr 17 19:01:17.238149 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:01:17.238114 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d519dd-b5e0-4df4-8672-0f14a3d00b08" path="/var/lib/kubelet/pods/66d519dd-b5e0-4df4-8672-0f14a3d00b08/volumes" Apr 17 19:23:40.498710 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:40.498626 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7bd689b8-2dsbv_7a1556c7-83af-4c76-adc1-75b2252706f3/maas-api/0.log" Apr 17 19:23:40.836071 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:40.836039 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6fc6488c9d-jbgdp_08b734ec-3c4b-42b6-830b-03d6aa1ece20/manager/0.log" Apr 17 19:23:41.178610 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:41.178528 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-cvvzj_d98f31e4-a83d-4a7f-819a-0d9fb01a99a3/postgres/0.log" Apr 17 19:23:41.933835 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:41.933801 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8_5768680b-c657-4605-86d2-05896de9cea7/extract/0.log" Apr 17 19:23:41.939584 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:41.939556 2571 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8_5768680b-c657-4605-86d2-05896de9cea7/util/0.log" Apr 17 19:23:41.945854 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:41.945834 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8_5768680b-c657-4605-86d2-05896de9cea7/pull/0.log" Apr 17 19:23:42.051822 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:42.051793 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7_ee74b3d9-cab8-4e88-9052-dd2d7f28e1df/util/0.log" Apr 17 19:23:42.057973 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:42.057949 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7_ee74b3d9-cab8-4e88-9052-dd2d7f28e1df/pull/0.log" Apr 17 19:23:42.064658 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:42.064640 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7_ee74b3d9-cab8-4e88-9052-dd2d7f28e1df/extract/0.log" Apr 17 19:23:42.176416 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:42.176390 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr_fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d/util/0.log" Apr 17 19:23:42.182892 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:42.182867 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr_fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d/pull/0.log" Apr 17 19:23:42.188953 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:42.188894 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr_fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d/extract/0.log" Apr 17 19:23:42.291818 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:42.291794 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2_b1b8254e-6889-4a58-99ba-3dfe78089325/extract/0.log" Apr 17 19:23:42.297217 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:42.297194 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2_b1b8254e-6889-4a58-99ba-3dfe78089325/util/0.log" Apr 17 19:23:42.302967 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:42.302913 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2_b1b8254e-6889-4a58-99ba-3dfe78089325/pull/0.log" Apr 17 19:23:42.542200 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:42.542172 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-rpcm7_7c1cefaf-94fe-4ab4-a072-078ba0be1ec3/manager/0.log" Apr 17 19:23:42.649918 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:42.649896 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-d4psq_a2d8d80b-bcb5-4e98-87c8-7bf38bd21c44/manager/0.log" Apr 17 19:23:42.873355 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:42.873280 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-66ln6_b2a9405a-505c-4d1a-8245-439bd2294533/registry-server/0.log" Apr 17 19:23:44.586824 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:44.586787 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cflfthd_bb4dbe95-70be-41e5-961f-23695c4912f3/istio-proxy/0.log" Apr 17 19:23:45.027543 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:45.027512 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-ldr4c_9c067f1a-52b1-40a7-a39b-61a914c4a305/istio-proxy/0.log" Apr 17 19:23:45.474749 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:45.474669 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz_03d7e0c5-238a-4f28-baa9-d78291c936f7/storage-initializer/0.log" Apr 17 19:23:45.482901 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:45.482875 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-nr2qz_03d7e0c5-238a-4f28-baa9-d78291c936f7/main/0.log" Apr 17 19:23:45.701779 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:45.701744 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-gbdcb_e9e59b59-c087-43af-9cd5-880aa3027e37/storage-initializer/0.log" Apr 17 19:23:45.709825 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:45.709795 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-gbdcb_e9e59b59-c087-43af-9cd5-880aa3027e37/main/0.log" Apr 17 19:23:45.819478 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:45.819425 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn_d0fba779-814b-477f-8f9a-e560adcb59e2/storage-initializer/0.log" Apr 17 19:23:45.827007 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:45.826977 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccg49hn_d0fba779-814b-477f-8f9a-e560adcb59e2/main/0.log" Apr 17 
19:23:57.535684 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:57.535650 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-44ptd_a2d6e798-4c54-4a87-9001-6aa609214c8a/global-pull-secret-syncer/0.log" Apr 17 19:23:57.709993 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:57.709964 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rjfcm_396ffb22-0675-49a2-99f6-018d18d53fe3/konnectivity-agent/0.log" Apr 17 19:23:57.729356 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:23:57.729328 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-192.ec2.internal_0286f842e2a4c3c84425f64aac72ff7f/haproxy/0.log" Apr 17 19:24:01.857625 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:01.857599 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8_5768680b-c657-4605-86d2-05896de9cea7/extract/0.log" Apr 17 19:24:01.897439 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:01.897414 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8_5768680b-c657-4605-86d2-05896de9cea7/util/0.log" Apr 17 19:24:01.934606 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:01.934583 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hqjx8_5768680b-c657-4605-86d2-05896de9cea7/pull/0.log" Apr 17 19:24:01.982511 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:01.982452 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7_ee74b3d9-cab8-4e88-9052-dd2d7f28e1df/extract/0.log" Apr 17 19:24:02.023184 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:02.023159 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7_ee74b3d9-cab8-4e88-9052-dd2d7f28e1df/util/0.log" Apr 17 19:24:02.069667 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:02.069634 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hb8v7_ee74b3d9-cab8-4e88-9052-dd2d7f28e1df/pull/0.log" Apr 17 19:24:02.111544 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:02.111454 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr_fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d/extract/0.log" Apr 17 19:24:02.148219 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:02.148191 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr_fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d/util/0.log" Apr 17 19:24:02.194664 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:02.194630 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73lndsr_fbd9f2e3-9354-4ec3-86bc-ddcc80c8cd9d/pull/0.log" Apr 17 19:24:02.241550 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:02.241525 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2_b1b8254e-6889-4a58-99ba-3dfe78089325/extract/0.log" Apr 17 19:24:02.284206 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:02.284176 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2_b1b8254e-6889-4a58-99ba-3dfe78089325/util/0.log" Apr 17 19:24:02.332857 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:02.332828 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1mrjm2_b1b8254e-6889-4a58-99ba-3dfe78089325/pull/0.log" Apr 17 19:24:02.604307 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:02.604275 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-rpcm7_7c1cefaf-94fe-4ab4-a072-078ba0be1ec3/manager/0.log" Apr 17 19:24:02.647084 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:02.647054 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-d4psq_a2d8d80b-bcb5-4e98-87c8-7bf38bd21c44/manager/0.log" Apr 17 19:24:02.743321 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:02.743290 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-66ln6_b2a9405a-505c-4d1a-8245-439bd2294533/registry-server/0.log" Apr 17 19:24:04.330246 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:04.330217 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68c979e-cae5-44ab-8530-6033686ab885/alertmanager/0.log" Apr 17 19:24:04.349366 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:04.349342 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68c979e-cae5-44ab-8530-6033686ab885/config-reloader/0.log" Apr 17 19:24:04.368804 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:04.368776 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68c979e-cae5-44ab-8530-6033686ab885/kube-rbac-proxy-web/0.log" Apr 17 19:24:04.395157 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:04.395137 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68c979e-cae5-44ab-8530-6033686ab885/kube-rbac-proxy/0.log" Apr 17 19:24:04.414253 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:04.414228 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68c979e-cae5-44ab-8530-6033686ab885/kube-rbac-proxy-metric/0.log" Apr 17 19:24:04.438016 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:04.437995 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68c979e-cae5-44ab-8530-6033686ab885/prom-label-proxy/0.log" Apr 17 19:24:04.457674 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:04.457628 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c68c979e-cae5-44ab-8530-6033686ab885/init-config-reloader/0.log" Apr 17 19:24:04.622897 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:04.622812 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-r9qw6_d2259eb2-1dc7-4fbb-95f8-174464456871/monitoring-plugin/0.log" Apr 17 19:24:04.652166 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:04.652139 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jjbzz_5a3e7ac2-a313-4475-835e-44fbfe441ae1/node-exporter/0.log" Apr 17 19:24:04.670567 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:04.670537 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jjbzz_5a3e7ac2-a313-4475-835e-44fbfe441ae1/kube-rbac-proxy/0.log" Apr 17 19:24:04.692256 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:04.692232 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jjbzz_5a3e7ac2-a313-4475-835e-44fbfe441ae1/init-textfile/0.log" Apr 17 19:24:05.148435 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:05.148397 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-759c9978dc-xj9k4_cc158f8a-b93a-495f-9067-c18fb50820cd/telemeter-client/0.log" Apr 17 19:24:05.167050 ip-10-0-132-192 kubenswrapper[2571]: I0417 
19:24:05.167016 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-759c9978dc-xj9k4_cc158f8a-b93a-495f-9067-c18fb50820cd/reload/0.log" Apr 17 19:24:05.186797 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:05.186762 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-759c9978dc-xj9k4_cc158f8a-b93a-495f-9067-c18fb50820cd/kube-rbac-proxy/0.log" Apr 17 19:24:06.144468 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.144436 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"] Apr 17 19:24:06.144838 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.144824 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66d519dd-b5e0-4df4-8672-0f14a3d00b08" containerName="authorino" Apr 17 19:24:06.144838 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.144836 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d519dd-b5e0-4df4-8672-0f14a3d00b08" containerName="authorino" Apr 17 19:24:06.144918 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.144899 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="66d519dd-b5e0-4df4-8672-0f14a3d00b08" containerName="authorino" Apr 17 19:24:06.147939 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.147923 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt" Apr 17 19:24:06.150107 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.150068 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-558qr\"/\"kube-root-ca.crt\"" Apr 17 19:24:06.151027 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.151005 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-558qr\"/\"default-dockercfg-6t7f8\"" Apr 17 19:24:06.151141 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.151008 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-558qr\"/\"openshift-service-ca.crt\"" Apr 17 19:24:06.157555 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.157534 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"] Apr 17 19:24:06.185718 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.185688 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-sys\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt" Apr 17 19:24:06.185871 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.185760 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-proc\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt" Apr 17 19:24:06.185871 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.185787 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-lib-modules\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt" Apr 17 19:24:06.185871 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.185808 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-podres\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt" Apr 17 19:24:06.185871 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.185831 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqt72\" (UniqueName: \"kubernetes.io/projected/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-kube-api-access-cqt72\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt" Apr 17 19:24:06.286801 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.286767 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-sys\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt" Apr 17 19:24:06.286976 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.286817 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-proc\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " 
pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"
Apr 17 19:24:06.286976 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.286842 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-lib-modules\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"
Apr 17 19:24:06.286976 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.286870 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-podres\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"
Apr 17 19:24:06.286976 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.286893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqt72\" (UniqueName: \"kubernetes.io/projected/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-kube-api-access-cqt72\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"
Apr 17 19:24:06.286976 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.286905 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-sys\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"
Apr 17 19:24:06.286976 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.286913 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-proc\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"
Apr 17 19:24:06.287220 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.287017 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-lib-modules\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"
Apr 17 19:24:06.287220 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.287025 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-podres\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"
Apr 17 19:24:06.294080 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.294057 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqt72\" (UniqueName: \"kubernetes.io/projected/45627a29-1088-4e8f-8f9b-c97bee7c7bfb-kube-api-access-cqt72\") pod \"perf-node-gather-daemonset-gk5zt\" (UID: \"45627a29-1088-4e8f-8f9b-c97bee7c7bfb\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"
Apr 17 19:24:06.459324 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.459240 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"
Apr 17 19:24:06.585430 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.585402 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"]
Apr 17 19:24:06.587507 ip-10-0-132-192 kubenswrapper[2571]: W0417 19:24:06.587479 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod45627a29_1088_4e8f_8f9b_c97bee7c7bfb.slice/crio-38ee039df06d024f73a145768a64af2ddf3f7560a406a8c673bf6f977970ae69 WatchSource:0}: Error finding container 38ee039df06d024f73a145768a64af2ddf3f7560a406a8c673bf6f977970ae69: Status 404 returned error can't find the container with id 38ee039df06d024f73a145768a64af2ddf3f7560a406a8c673bf6f977970ae69
Apr 17 19:24:06.589423 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:06.589399 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 19:24:07.194664 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:07.194626 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt" event={"ID":"45627a29-1088-4e8f-8f9b-c97bee7c7bfb","Type":"ContainerStarted","Data":"1a192923036d2e7f0b96d64fbe1a43594b2907e0b4d8d4163a2ff1822b2b062d"}
Apr 17 19:24:07.194664 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:07.194666 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt" event={"ID":"45627a29-1088-4e8f-8f9b-c97bee7c7bfb","Type":"ContainerStarted","Data":"38ee039df06d024f73a145768a64af2ddf3f7560a406a8c673bf6f977970ae69"}
Apr 17 19:24:07.195126 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:07.194690 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"
Apr 17 19:24:07.210614 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:07.210571 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt" podStartSLOduration=1.210559033 podStartE2EDuration="1.210559033s" podCreationTimestamp="2026-04-17 19:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 19:24:07.207325142 +0000 UTC m=+2086.481342436" watchObservedRunningTime="2026-04-17 19:24:07.210559033 +0000 UTC m=+2086.484576351"
Apr 17 19:24:07.248021 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:07.248001 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-556fb96dfc-fjvvd_f49ba530-b751-4126-8940-163ce9a5d35b/console/0.log"
Apr 17 19:24:08.471441 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:08.471411 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gr8p7_029e659d-f8ff-4796-ba06-aba3f3e2a830/dns/0.log"
Apr 17 19:24:08.490416 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:08.490392 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gr8p7_029e659d-f8ff-4796-ba06-aba3f3e2a830/kube-rbac-proxy/0.log"
Apr 17 19:24:08.618610 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:08.618583 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-plbd4_8210137d-ed94-434c-897d-f67481261a39/dns-node-resolver/0.log"
Apr 17 19:24:09.090806 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:09.090779 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-t5hn4_3d42e5aa-a588-4ce1-a264-4581a72945cb/node-ca/0.log"
Apr 17 19:24:09.842364 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:09.842329 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cflfthd_bb4dbe95-70be-41e5-961f-23695c4912f3/istio-proxy/0.log"
Apr 17 19:24:10.066956 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:10.066926 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-ldr4c_9c067f1a-52b1-40a7-a39b-61a914c4a305/istio-proxy/0.log"
Apr 17 19:24:10.570806 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:10.570779 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ntz86_ff724f21-8096-4540-9e0b-484999e3ecd1/serve-healthcheck-canary/0.log"
Apr 17 19:24:11.005919 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:11.005894 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-88g9b_c5fd97d3-b332-4c85-9344-c0e9f314aed6/insights-operator/1.log"
Apr 17 19:24:11.006559 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:11.006542 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-88g9b_c5fd97d3-b332-4c85-9344-c0e9f314aed6/insights-operator/0.log"
Apr 17 19:24:11.024125 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:11.024106 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2f7gh_b37994a4-c3f2-4b31-a4bf-86096fa268fb/kube-rbac-proxy/0.log"
Apr 17 19:24:11.042129 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:11.042106 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2f7gh_b37994a4-c3f2-4b31-a4bf-86096fa268fb/exporter/0.log"
Apr 17 19:24:11.061307 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:11.061287 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2f7gh_b37994a4-c3f2-4b31-a4bf-86096fa268fb/extractor/0.log"
Apr 17 19:24:12.991016 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:12.990990 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7bd689b8-2dsbv_7a1556c7-83af-4c76-adc1-75b2252706f3/maas-api/0.log"
Apr 17 19:24:13.092819 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:13.092784 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6fc6488c9d-jbgdp_08b734ec-3c4b-42b6-830b-03d6aa1ece20/manager/0.log"
Apr 17 19:24:13.174743 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:13.174717 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-cvvzj_d98f31e4-a83d-4a7f-819a-0d9fb01a99a3/postgres/0.log"
Apr 17 19:24:13.208906 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:13.208879 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-gk5zt"
Apr 17 19:24:14.248444 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:14.248417 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-697b5bd5df-x9vvj_19dba8f8-4b70-40ff-9034-d52c4e0f0ba5/manager/0.log"
Apr 17 19:24:14.270211 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:14.270181 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-bgmdb_52f3400c-0624-46b1-a696-6ef36d97c1ef/openshift-lws-operator/0.log"
Apr 17 19:24:18.617301 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:18.617267 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-mflf9_4c94010b-eab7-490c-8843-0f4859c4d6fd/migrator/0.log"
Apr 17 19:24:18.635376 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:18.635347 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-mflf9_4c94010b-eab7-490c-8843-0f4859c4d6fd/graceful-termination/0.log"
Apr 17 19:24:20.255253 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:20.255227 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-svmpv_970c6428-01b9-4aa1-be61-fc714b218008/kube-multus-additional-cni-plugins/0.log"
Apr 17 19:24:20.275609 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:20.275581 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-svmpv_970c6428-01b9-4aa1-be61-fc714b218008/egress-router-binary-copy/0.log"
Apr 17 19:24:20.295783 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:20.295763 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-svmpv_970c6428-01b9-4aa1-be61-fc714b218008/cni-plugins/0.log"
Apr 17 19:24:20.314996 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:20.314978 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-svmpv_970c6428-01b9-4aa1-be61-fc714b218008/bond-cni-plugin/0.log"
Apr 17 19:24:20.334115 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:20.334096 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-svmpv_970c6428-01b9-4aa1-be61-fc714b218008/routeoverride-cni/0.log"
Apr 17 19:24:20.352899 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:20.352875 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-svmpv_970c6428-01b9-4aa1-be61-fc714b218008/whereabouts-cni-bincopy/0.log"
Apr 17 19:24:20.371173 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:20.371148 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-svmpv_970c6428-01b9-4aa1-be61-fc714b218008/whereabouts-cni/0.log"
Apr 17 19:24:20.412524 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:20.412497 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcrqw_d04b9d8e-1775-4cd8-ac25-94161e15b4ee/kube-multus/0.log"
Apr 17 19:24:20.482107 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:20.482079 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-d99zz_9cb68ed8-ce9b-48b8-9980-07d87baf968b/network-metrics-daemon/0.log"
Apr 17 19:24:20.498077 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:20.498051 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-d99zz_9cb68ed8-ce9b-48b8-9980-07d87baf968b/kube-rbac-proxy/0.log"
Apr 17 19:24:21.873271 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:21.873221 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sctgd_ccccfa46-ab60-4610-8d60-6fad3773eedb/ovn-controller/0.log"
Apr 17 19:24:21.911473 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:21.911436 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sctgd_ccccfa46-ab60-4610-8d60-6fad3773eedb/ovn-acl-logging/0.log"
Apr 17 19:24:21.930872 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:21.930847 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sctgd_ccccfa46-ab60-4610-8d60-6fad3773eedb/kube-rbac-proxy-node/0.log"
Apr 17 19:24:21.950438 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:21.950414 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sctgd_ccccfa46-ab60-4610-8d60-6fad3773eedb/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 19:24:21.970486 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:21.970447 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sctgd_ccccfa46-ab60-4610-8d60-6fad3773eedb/northd/0.log"
Apr 17 19:24:21.990254 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:21.990222 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sctgd_ccccfa46-ab60-4610-8d60-6fad3773eedb/nbdb/0.log"
Apr 17 19:24:22.009469 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:22.009433 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sctgd_ccccfa46-ab60-4610-8d60-6fad3773eedb/sbdb/0.log"
Apr 17 19:24:22.173549 ip-10-0-132-192 kubenswrapper[2571]: I0417 19:24:22.173450 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sctgd_ccccfa46-ab60-4610-8d60-6fad3773eedb/ovnkube-controller/0.log"