Apr 16 04:24:11.964670 ip-10-0-133-103 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 04:24:12.431548 ip-10-0-133-103 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 04:24:12.431548 ip-10-0-133-103 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 04:24:12.431548 ip-10-0-133-103 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 04:24:12.431548 ip-10-0-133-103 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 04:24:12.431548 ip-10-0-133-103 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
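The deprecation warnings above all point the same way: these kubelet flags should move into the file passed via --config. A minimal sketch of what that KubeletConfiguration excerpt could look like — the file path matches the --config value logged further down, but every value below is illustrative, not read from this node:

```yaml
# Hypothetical excerpt of /etc/kubernetes/kubelet.conf (the kubelet --config file).
# Field names are real KubeletConfiguration fields; the values are placeholders.
kind: KubeletConfiguration
apiVersion: kubelet.config.k8s.io/v1beta1
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"   # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # replaces --volume-plugin-dir
systemReserved:                                               # replaces --system-reserved
  cpu: "500m"
  memory: "1Gi"
evictionHard:                                                 # --eviction-hard, the suggested
  memory.available: "100Mi"                                   # replacement for --minimum-container-ttl-duration
```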
Apr 16 04:24:12.433230 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.433136 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 04:24:12.436250 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436233 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:12.436250 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436250 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436253 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436258 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436261 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436264 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436266 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436270 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436275 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436278 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436281 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436284 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436286 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436289 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436292 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436294 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436303 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436306 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436309 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436312 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436315 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 04:24:12.436314 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436317 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436320 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436323 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436339 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436342 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436345 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436348 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436351 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436354 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436356 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436359 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436361 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436364 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436367 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436369 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436372 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436374 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436376 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436379 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436381 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:12.436822 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436384 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436386 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436389 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436392 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436394 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436397 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436399 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436402 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436404 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436407 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436409 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436412 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436414 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436417 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436421 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436424 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436426 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436429 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436432 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 04:24:12.437343 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436435 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436438 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436440 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436443 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436447 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436450 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436453 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436456 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436459 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436463 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436466 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436469 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436472 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436475 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436477 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436481 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436484 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436487 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436489 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:12.437891 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436492 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436495 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436497 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436500 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436502 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436505 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.436507 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437764 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437771 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437775 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437780 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437783 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437786 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437789 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437791 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437794 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437797 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437800 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437803 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437805 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:12.438407 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437808 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437811 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437813 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437817 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437819 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437822 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437825 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437827 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437830 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437832 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437835 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437837 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437840 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437842 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437845 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437847 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437849 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437852 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437854 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437857 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:12.438935 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437859 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437862 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437864 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437867 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437869 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437872 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437874 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437877 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437879 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437882 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437885 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437887 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437890 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437892 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437895 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437898 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437901 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437904 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437906 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437909 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:12.439824 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437912 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437914 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437917 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437920 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437923 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437925 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437928 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437930 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437933 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437935 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437938 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437940 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437942 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437945 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437947 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437950 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437953 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437955 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437957 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437962 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:12.440388 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437966 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437970 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437972 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437975 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437978 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437981 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437984 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437986 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437989 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437992 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437995 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.437997 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.438000 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438088 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438099 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438109 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438113 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438118 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438121 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438126 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438130 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 04:24:12.440888 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438134 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438137 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438141 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438144 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438147 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438150 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438153 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438156 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438159 2575 flags.go:64] FLAG: --cloud-config=""
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438162 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438165 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438169 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438172 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438175 2575 flags.go:64] FLAG: --config-dir=""
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438178 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438181 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438186 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438189 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438192 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438195 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438198 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438202 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438206 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438209 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438212 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 04:24:12.441419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438217 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438220 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438222 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438225 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438229 2575 flags.go:64] FLAG: --enable-server="true"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438232 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438237 2575 flags.go:64] FLAG: --event-burst="100"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438240 2575 flags.go:64] FLAG: --event-qps="50"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438243 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438246 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438249 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438253 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438256 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438259 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438262 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438265 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438268 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438271 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438275 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438278 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438281 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438283 2575 flags.go:64] FLAG: --feature-gates=""
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438287 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438291 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438294 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 04:24:12.442023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438297 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438300 2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438303 2575 flags.go:64] FLAG: --help="false"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438306 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438312 2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438315 2575 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438318 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438322 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438337 2575 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438340 2575 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438343 2575 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438346 2575 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438349 2575 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438352 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438355 2575 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438358 2575 flags.go:64] FLAG: --kube-reserved=""
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438361 2575 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438364 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438367 2575 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438369 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438372 2575 flags.go:64] FLAG: --lock-file=""
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438375 2575 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438378 2575 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438381 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 04:24:12.442653 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438387 2575 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438390 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438393 2575 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438396 2575 flags.go:64] FLAG: --logging-format="text"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438399 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438402 2575 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438405 2575 flags.go:64] FLAG: --manifest-url=""
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438408 2575 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438412 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438415 2575 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438420 2575 flags.go:64] FLAG: --max-pods="110"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438423 2575 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438426 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438429 2575 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438432 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438435 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438439 2575 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438442 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438450 2575 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438453 2575 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438456 2575 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438460 2575 flags.go:64] FLAG: --pod-cidr=""
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438463 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 04:24:12.443234 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438468 2575 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438471 2575 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438474 2575 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438477 2575 flags.go:64] FLAG: --port="10250"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438480 2575 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438483 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-033bd3d872cd904ba"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438486 2575 flags.go:64] FLAG: --qos-reserved=""
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438489 2575 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438492 2575 flags.go:64] FLAG: --register-node="true"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438495 2575 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438498 2575 flags.go:64] FLAG: --register-with-taints=""
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438507 2575 flags.go:64] FLAG: --registry-burst="10"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438510 2575 flags.go:64] FLAG: --registry-qps="5"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438513 2575 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438517 2575 flags.go:64] FLAG: --reserved-memory=""
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438521 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438524 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438527 2575 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438530 2575 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438533 2575 flags.go:64] FLAG: --runonce="false"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438536 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438539 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438542 2575 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438546 2575 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438549 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438552 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 04:24:12.443808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438555 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438559 2575 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438562 2575 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438565 2575 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438568 2575 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438571 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438574 2575 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438577 2575 flags.go:64] FLAG: --system-cgroups=""
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438580 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438586 2575 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438589 2575 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438592 2575 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438595 2575 flags.go:64] FLAG: --tls-min-version=""
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438597 2575 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438600 2575 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438603 2575 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438606 2575 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438610 2575 flags.go:64] FLAG: --v="2"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438614 2575 flags.go:64] FLAG: --version="false"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438618 2575 flags.go:64] FLAG: --vmodule=""
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438623 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.438626 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439536 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439555 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 04:24:12.444475 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439561 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439566 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439571 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439576 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439581 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439592 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439597 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439601 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439606 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439611 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439615 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439620 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439624 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439629 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439633 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439637 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439642 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439651 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439655 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:12.445062 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439659 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439664 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439668 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439673 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439677 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439681 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439685 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439690 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439694 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439698 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439703 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439711 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439784 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439790 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439796 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439801 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439806 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439810 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439815 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439824 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 04:24:12.445599 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439831 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439836 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439841 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439847 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439857 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439861 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439866 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439883 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439891 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.439941 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440009 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440013 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440016 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440018 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440021 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440025 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440027 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440030 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440035 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440038 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 04:24:12.446105 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440040 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440043 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440046 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440049 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440054 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440060 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440063 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440066 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440069 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440073 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440076 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440080 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440083 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440086 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440097 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440099 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440102 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440105 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440108 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 04:24:12.446612 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440111 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:12.447082 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440114 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:12.447082 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440118 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 04:24:12.447082 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440120 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:12.447082 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440123 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 04:24:12.447082 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.440126 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 04:24:12.447082 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.440813 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 04:24:12.447643 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.447621 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 04:24:12.447681 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.447645 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 04:24:12.447712 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447699 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:12.447712 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447705 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 04:24:12.447712 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447709 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 04:24:12.447712 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447712 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447715 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447719 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447722 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447725 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447727 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447730 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447733 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447735 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447738 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447740 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447743 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447746 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447749 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447751 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447754 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447756 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447759 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447762 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:12.447815 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447766 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447771 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447774 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447777 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447780 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447783 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447786 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447789 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447792 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447794 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447797 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447800 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447803 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447805 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447808 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447812 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447814 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447817 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447820 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447823 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 04:24:12.448299 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447826 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447829 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447831 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447834 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447837 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447839 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447842 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447845 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447847 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447850 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447852 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447855 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447858 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447860 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447863 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447865 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447868 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447871 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447873 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 04:24:12.448818 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447876 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447879 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447881 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447884 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447886 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447889 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447892 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447894 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447898 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447901 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447904 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447908 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447912 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447915 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447917 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447920 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447924 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447926 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447929 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447931 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:12.449321 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447934 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447937 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447940 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447942 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.447945 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.447950 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448061 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448066 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448069 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448072 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448075 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448078 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448081 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448084 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448087 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 04:24:12.449832 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448114 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448119 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448124 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448127 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448130 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448134 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448138 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448141 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448144 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448146 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448149 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448152 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448155 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448158 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448160 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448163 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448165 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448168 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448170 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 04:24:12.450210 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448173 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448175 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448178 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448181 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448183 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448186 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448188 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448191 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448193 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448196 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448199 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448201 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448203 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448206 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448210 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448213 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448215 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448218 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448221 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448224 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 04:24:12.450699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448227 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448229 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448232 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448234 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448237 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448239 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448242 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448244 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448247 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448249 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448252 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448255 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448257 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448259 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448262 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448264 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448267 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448270 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448272 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448274 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 04:24:12.451193 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448277 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448279 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448282 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448285 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448287 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448290 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448292 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448295 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448297 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448300 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448302 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448306 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448309 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448312 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448315 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448317 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448320 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:12.451699 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:12.448322 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:12.452130 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.448342 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 04:24:12.452130 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.449079 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 04:24:12.452130 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.451662 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 04:24:12.452640 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.452627 2575 server.go:1019] "Starting client certificate rotation"
Apr 16 04:24:12.452742 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.452725 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 04:24:12.452779 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.452770 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 04:24:12.478574 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.478550 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 04:24:12.480319 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.480298 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 04:24:12.498907 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.498869 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 16 04:24:12.505697 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.505670 2575 log.go:25] "Validated CRI v1 image API"
Apr 16 04:24:12.507020 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.506997 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 04:24:12.509522 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.509505 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 04:24:12.511543 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.511523 2575 fs.go:135] Filesystem UUIDs: map[762fafa6-43fa-4cd7-ad51-5090680f54cd:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c55579d0-6ff9-431c-9499-57fa362ac300:/dev/nvme0n1p3]
Apr 16 04:24:12.511603 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.511543 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 04:24:12.517107 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.516985 2575 manager.go:217] Machine: {Timestamp:2026-04-16 04:24:12.515210802 +0000 UTC m=+0.417986791 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098771 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f18b56a783069f0fc1e52fadcf42d SystemUUID:ec2f18b5-6a78-3069-f0fc-1e52fadcf42d BootID:6f3a652a-7536-4908-9cb2-7468db32ca03 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7f:e2:74:c7:0d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7f:e2:74:c7:0d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:3e:fe:75:8f:e8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 04:24:12.517107 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.517097 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 04:24:12.517241 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.517191 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 04:24:12.518336 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.518297 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 04:24:12.518492 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.518340 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-103.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 04:24:12.518545 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.518501 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 04:24:12.518545 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.518516 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 04:24:12.518545 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.518534 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 04:24:12.519243 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.519231 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 04:24:12.520603 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.520592 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 04:24:12.520711 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.520702 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 04:24:12.523196 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.523184 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 04:24:12.523243 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.523201 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 04:24:12.523243 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.523213 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 04:24:12.523243 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.523222 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 16 04:24:12.523243 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.523233 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 04:24:12.524406 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.524393 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 04:24:12.524459 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.524412 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 04:24:12.527277 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.527250 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n9xs7"
Apr 16 04:24:12.527490 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.527474 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 04:24:12.530150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.530136 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 04:24:12.531443 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.531426 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 04:24:12.531443 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.531443 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 04:24:12.531443 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.531449 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 04:24:12.531649 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.531454 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 04:24:12.531649 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.531460 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 04:24:12.531649 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.531466 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 04:24:12.531649 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.531472 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 04:24:12.531649 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.531478 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 04:24:12.531649 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.531485 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 04:24:12.531649 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.531491 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 04:24:12.531649 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.531499 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 04:24:12.531649 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.531508 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 04:24:12.532968 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.532957 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 04:24:12.532968 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.532968 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 04:24:12.536510 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.536495 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 04:24:12.536601 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.536528 2575 server.go:1295] "Started kubelet"
Apr 16 04:24:12.536923 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.536903 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 04:24:12.537175 ip-10-0-133-103 systemd[1]: Started Kubernetes Kubelet.
Apr 16 04:24:12.538048 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.538009 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 04:24:12.538806 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.538768 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 04:24:12.538870 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.538824 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 04:24:12.540011 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.539995 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 04:24:12.540424 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.540406 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-103.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 04:24:12.540559 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.540540 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-103.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 04:24:12.540628 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.540583 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 04:24:12.544489 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.544471 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 04:24:12.544611 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.544591 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 04:24:12.544674 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.544619 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n9xs7"
Apr 16 04:24:12.545248 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.545232 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 04:24:12.545665 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.545626 2575 factory.go:55] Registering systemd factory
Apr 16 04:24:12.549773 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.545822 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 04:24:12.549862 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.549855 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 04:24:12.549976 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.549956 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 16 04:24:12.550125 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.550111 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:12.550254 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.550237 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 04:24:12.550358 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.550256 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 04:24:12.550447 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.546210 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-103.ec2.internal.18a6bbac7c2fad68 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-103.ec2.internal,UID:ip-10-0-133-103.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-103.ec2.internal,},FirstTimestamp:2026-04-16 04:24:12.536507752 +0000 UTC m=+0.439283741,LastTimestamp:2026-04-16 04:24:12.536507752 +0000 UTC m=+0.439283741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-103.ec2.internal,}"
Apr 16 04:24:12.550513 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.550495 2575 factory.go:153] Registering CRI-O factory
Apr 16 04:24:12.550513 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.550509 2575 factory.go:223] Registration of the crio container factory successfully
Apr 16 04:24:12.550614 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.550589 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 04:24:12.550663 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.550617 2575 factory.go:103] Registering Raw factory
Apr 16 04:24:12.552800 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.552783 2575 manager.go:1196] Started watching for new ooms in manager
Apr 16 04:24:12.553497 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.553472 2575 manager.go:319] Starting recovery of all containers
Apr 16 04:24:12.557118 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.557095 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:12.557118 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.557119 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-103.ec2.internal\" not found" node="ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.558952 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.558490 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 04:24:12.566149 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.566016 2575 manager.go:324] Recovery completed
Apr 16 04:24:12.570625 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.570605 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 04:24:12.573074 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.573053 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 04:24:12.573163 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.573087 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 04:24:12.573163 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.573105 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasSufficientPID"
Apr 16 04:24:12.573660 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.573645 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 04:24:12.573714 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.573660 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 04:24:12.573714 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.573677 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 04:24:12.576843 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.576820 2575 policy_none.go:49] "None policy: Start"
Apr 16 04:24:12.576843 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.576843 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 04:24:12.576949 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.576854 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 04:24:12.634258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.627035 2575 manager.go:341] "Starting Device Plugin manager"
Apr 16 04:24:12.634258 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.627079 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 04:24:12.634258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.627091 2575 server.go:85] "Starting device plugin registration server"
Apr 16 04:24:12.634258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.627448 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 04:24:12.634258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.627478 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 04:24:12.634258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.627682 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 04:24:12.634258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.627991 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 04:24:12.634258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.628003 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 04:24:12.634258 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.628324 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 04:24:12.634258 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.628380 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:12.700670 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.700585 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 04:24:12.701883 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.701865 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 04:24:12.701978 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.701898 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 04:24:12.701978 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.701943 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 04:24:12.701978 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.701954 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 04:24:12.702116 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.701999 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 04:24:12.705971 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.705951 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:12.728022 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.727984 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 04:24:12.728960 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.728942 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 04:24:12.729074 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.728976 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 04:24:12.729074 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.728993 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasSufficientPID"
Apr 16 04:24:12.729074 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.729029 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.737264 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.737247 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.737358 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.737272 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-103.ec2.internal\": node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:12.753041 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.753018 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:12.802437 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.802384 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-103.ec2.internal"]
Apr 16 04:24:12.802581 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.802517 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 04:24:12.804222 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.804205 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 04:24:12.804319 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.804234 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 04:24:12.804319 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.804254 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasSufficientPID"
Apr 16 04:24:12.806604 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.806591 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 04:24:12.806790 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.806777 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.806860 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.806821 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 04:24:12.807410 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.807394 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 04:24:12.807511 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.807417 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 04:24:12.807511 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.807398 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 04:24:12.807511 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.807432 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasSufficientPID"
Apr 16 04:24:12.807511 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.807456 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 04:24:12.807511 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.807473 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasSufficientPID"
Apr 16 04:24:12.810203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.810188 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.810282 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.810213 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 04:24:12.810987 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.810970 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 04:24:12.811077 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.810992 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 04:24:12.811077 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.811021 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeHasSufficientPID"
Apr 16 04:24:12.833147 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.833117 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-103.ec2.internal\" not found" node="ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.837621 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.837604 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-103.ec2.internal\" not found" node="ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.851159 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.851140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b587bad82e85625e2d9083084ab92f62-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal\" (UID: \"b587bad82e85625e2d9083084ab92f62\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.851237 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.851167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b587bad82e85625e2d9083084ab92f62-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal\" (UID: \"b587bad82e85625e2d9083084ab92f62\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.851237 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.851185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/036d1e1671f062f79bfb2c499603021f-config\") pod \"kube-apiserver-proxy-ip-10-0-133-103.ec2.internal\" (UID: \"036d1e1671f062f79bfb2c499603021f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.853211 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.853195 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:12.951925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.951857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b587bad82e85625e2d9083084ab92f62-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal\" (UID: \"b587bad82e85625e2d9083084ab92f62\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.951925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.951894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b587bad82e85625e2d9083084ab92f62-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal\" (UID: \"b587bad82e85625e2d9083084ab92f62\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.951925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.951916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/036d1e1671f062f79bfb2c499603021f-config\") pod \"kube-apiserver-proxy-ip-10-0-133-103.ec2.internal\" (UID: \"036d1e1671f062f79bfb2c499603021f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.952058 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.951969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/036d1e1671f062f79bfb2c499603021f-config\") pod \"kube-apiserver-proxy-ip-10-0-133-103.ec2.internal\" (UID: \"036d1e1671f062f79bfb2c499603021f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.952058 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.951975 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b587bad82e85625e2d9083084ab92f62-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal\" (UID: \"b587bad82e85625e2d9083084ab92f62\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.952058 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:12.951974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b587bad82e85625e2d9083084ab92f62-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal\" (UID: \"b587bad82e85625e2d9083084ab92f62\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:12.953927 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:12.953913 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:13.054708 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:13.054669 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:13.137912 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.137871 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:13.140402 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.140388 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-103.ec2.internal"
Apr 16 04:24:13.155658 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:13.155627 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:13.256244 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:13.256147 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:13.356761 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:13.356713 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:13.412496 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.412471 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:13.452702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.452678 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 04:24:13.453108 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.452814 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 04:24:13.453108 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.452857 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 04:24:13.453108 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.452857 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 04:24:13.456966 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:13.456941 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:13.545436 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.545359 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 04:24:13.550110 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.550086 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 04:19:12 +0000 UTC" deadline="2027-12-13 06:59:36.280258675 +0000 UTC"
Apr 16 04:24:13.550174 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.550109 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14546h35m22.73015151s"
Apr 16 04:24:13.556520 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.556497 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 04:24:13.557560 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:13.557545 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:13.576439 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.576413 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rnwmz"
Apr 16 04:24:13.581130 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.581114 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rnwmz"
Apr 16 04:24:13.658070 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:13.658037 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:13.758682 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:13.758652 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16 04:24:13.776345 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:13.776303 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod036d1e1671f062f79bfb2c499603021f.slice/crio-08ba7d2ae040469425967744615dd86e9157d551f9e810a1d3e5a4233429dff3 WatchSource:0}: Error finding container 08ba7d2ae040469425967744615dd86e9157d551f9e810a1d3e5a4233429dff3: Status 404 returned error can't find the container with id 08ba7d2ae040469425967744615dd86e9157d551f9e810a1d3e5a4233429dff3
Apr 16 04:24:13.776916 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:13.776875 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb587bad82e85625e2d9083084ab92f62.slice/crio-62282cd0704e7556fc4dd96d1b7833b1f1da9007c8da74e90f365d3cc5577399 WatchSource:0}: Error finding container 62282cd0704e7556fc4dd96d1b7833b1f1da9007c8da74e90f365d3cc5577399: Status 404 returned error can't find the container with id 62282cd0704e7556fc4dd96d1b7833b1f1da9007c8da74e90f365d3cc5577399
Apr 16 04:24:13.780954 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:13.780936 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 04:24:13.859553 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:13.859462 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found"
Apr 16
04:24:13.959965 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:13.959922 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-103.ec2.internal\" not found" Apr 16 04:24:14.005856 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.005829 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 04:24:14.046092 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.046063 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-103.ec2.internal" Apr 16 04:24:14.057504 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.057480 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 04:24:14.058719 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.058705 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal" Apr 16 04:24:14.067010 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.066990 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 04:24:14.525611 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.525344 2575 apiserver.go:52] "Watching apiserver" Apr 16 04:24:14.532786 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.532751 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 04:24:14.535631 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.535601 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-additional-cni-plugins-cxwbn","openshift-multus/network-metrics-daemon-875wc","openshift-ovn-kubernetes/ovnkube-node-4bdrp","kube-system/konnectivity-agent-c4k78","kube-system/kube-apiserver-proxy-ip-10-0-133-103.ec2.internal","openshift-cluster-node-tuning-operator/tuned-864l5","openshift-dns/node-resolver-w7h7r","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal","openshift-multus/multus-qksrt","openshift-network-diagnostics/network-check-target-jv9m2","openshift-network-operator/iptables-alerter-x6z2n","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28","openshift-image-registry/node-ca-dj5pp"] Apr 16 04:24:14.540836 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.540812 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w7h7r" Apr 16 04:24:14.540947 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.540852 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:14.540947 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:14.540920 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929" Apr 16 04:24:14.543409 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.543386 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.543557 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.543532 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 04:24:14.543557 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.543545 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 04:24:14.543822 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.543804 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rg4l9\"" Apr 16 04:24:14.545925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.545905 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t579d\"" Apr 16 04:24:14.548578 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.547752 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 04:24:14.548578 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.547768 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 04:24:14.548578 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.548176 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 04:24:14.548578 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.548520 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 04:24:14.548578 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.548555 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 04:24:14.551238 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.550215 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 04:24:14.551866 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.551850 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c4k78" Apr 16 04:24:14.554394 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.554373 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 04:24:14.554657 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.554639 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 04:24:14.554917 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.554892 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qxqfm\"" Apr 16 04:24:14.555723 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.555237 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.557448 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.557428 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 04:24:14.557547 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.557518 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9r2xw\"" Apr 16 04:24:14.557815 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.557781 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.557815 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.557806 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 04:24:14.558386 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.557698 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cxwbn" Apr 16 04:24:14.560451 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.560417 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:14.560565 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:14.560487 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5" Apr 16 04:24:14.561955 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.561927 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 04:24:14.562141 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562124 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 04:24:14.562397 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562381 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mt6hr\"" Apr 16 04:24:14.562679 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562581 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 04:24:14.562679 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562635 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 04:24:14.562679 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-slash\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.562847 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562696 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-run-openvswitch\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 
16 04:24:14.562847 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.562847 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562751 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b42bbbf-91b9-4c9a-b525-a6832184768c-ovnkube-config\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.562847 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fbd09ec2-88c2-46c7-b3da-09c43c74e6ab-konnectivity-ca\") pod \"konnectivity-agent-c4k78\" (UID: \"fbd09ec2-88c2-46c7-b3da-09c43c74e6ab\") " pod="kube-system/konnectivity-agent-c4k78" Apr 16 04:24:14.562847 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-lib-modules\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.562847 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562830 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/1d98373a-f359-466d-8c72-1e9b4e835c04-hosts-file\") pod \"node-resolver-w7h7r\" (UID: \"1d98373a-f359-466d-8c72-1e9b4e835c04\") " pod="openshift-dns/node-resolver-w7h7r" Apr 16 04:24:14.563150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562876 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 04:24:14.563150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562898 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-bhlg2\"" Apr 16 04:24:14.563150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562900 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-run-ovn-kubernetes\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.563150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562928 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-kubernetes\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.563150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.562965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-var-lib-kubelet\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.563150 ip-10-0-133-103 kubenswrapper[2575]: I0416 
04:24:14.562992 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-tmp\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.563150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xxbj\" (UniqueName: \"kubernetes.io/projected/1d98373a-f359-466d-8c72-1e9b4e835c04-kube-api-access-8xxbj\") pod \"node-resolver-w7h7r\" (UID: \"1d98373a-f359-466d-8c72-1e9b4e835c04\") " pod="openshift-dns/node-resolver-w7h7r" Apr 16 04:24:14.563150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:14.563150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-var-lib-openvswitch\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.563150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-run-ovn\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.563150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-log-socket\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b42bbbf-91b9-4c9a-b525-a6832184768c-ovnkube-script-lib\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fhhw\" (UniqueName: \"kubernetes.io/projected/1b42bbbf-91b9-4c9a-b525-a6832184768c-kube-api-access-6fhhw\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fbd09ec2-88c2-46c7-b3da-09c43c74e6ab-agent-certs\") pod \"konnectivity-agent-c4k78\" (UID: \"fbd09ec2-88c2-46c7-b3da-09c43c74e6ab\") " pod="kube-system/konnectivity-agent-c4k78" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563291 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/1d98373a-f359-466d-8c72-1e9b4e835c04-tmp-dir\") pod \"node-resolver-w7h7r\" (UID: \"1d98373a-f359-466d-8c72-1e9b4e835c04\") " pod="openshift-dns/node-resolver-w7h7r" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563346 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-run-systemd\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-cni-bin\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563406 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b42bbbf-91b9-4c9a-b525-a6832184768c-ovn-node-metrics-cert\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563432 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563440 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-sysctl-d\") pod \"tuned-864l5\" (UID: 
\"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563488 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gcfb\" (UniqueName: \"kubernetes.io/projected/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-kube-api-access-4gcfb\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-systemd-units\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-sysconfig\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-run\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-run-netns\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563674 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-etc-openvswitch\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.563702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563698 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-node-log\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.564596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-host\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.564596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-tuned\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.564596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563824 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rmg9\" (UniqueName: \"kubernetes.io/projected/414aeb12-26d8-4994-8fe8-1cad0342c929-kube-api-access-4rmg9\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:24:14.564596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b42bbbf-91b9-4c9a-b525-a6832184768c-env-overrides\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.564596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563906 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-sysctl-conf\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.564596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-systemd\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.564596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563952 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-sys\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.564596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.563992 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-kubelet\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.564596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.564023 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-cni-netd\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.564596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.564050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-modprobe-d\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.564596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.564256 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-x6z2n"
Apr 16 04:24:14.567393 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.566625 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 04:24:14.567393 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.566690 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28"
Apr 16 04:24:14.567393 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.566839 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 04:24:14.567627 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.567610 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n882s\""
Apr 16 04:24:14.567764 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.567742 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 04:24:14.569062 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.569043 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 04:24:14.569474 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.569455 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 04:24:14.569556 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.569507 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2l45v\""
Apr 16 04:24:14.569619 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.569554 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dj5pp"
Apr 16 04:24:14.570069 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.570051 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 04:24:14.571941 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.571922 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 04:24:14.572170 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.572149 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 04:24:14.572474 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.572452 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 04:24:14.572682 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.572588 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4kjlp\""
Apr 16 04:24:14.582979 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.582935 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 04:19:13 +0000 UTC" deadline="2027-11-03 07:58:59.294540206 +0000 UTC"
Apr 16 04:24:14.582979 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.582972 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13587h34m44.711571971s"
Apr 16 04:24:14.620524 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.620500 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:14.647646 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.647610 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 04:24:14.665266 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665235 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-cni-bin\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.665266 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-sysctl-d\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.665526 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665294 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gcfb\" (UniqueName: \"kubernetes.io/projected/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-kube-api-access-4gcfb\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.665526 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665317 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-multus-socket-dir-parent\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.665526 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-var-lib-cni-multus\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.665526 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665374 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-cni-bin\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.665526 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665381 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/961e60a9-039d-43e7-8613-150ee4d6b85d-multus-daemon-config\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.665526 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665427 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-systemd-units\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.665526 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665456 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-run\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.665526 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-sysctl-d\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.665526 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp5f6\" (UniqueName: \"kubernetes.io/projected/f587afea-01f4-4981-9efb-09779dde706b-kube-api-access-pp5f6\") pod \"node-ca-dj5pp\" (UID: \"f587afea-01f4-4981-9efb-09779dde706b\") " pod="openshift-image-registry/node-ca-dj5pp"
Apr 16 04:24:14.665526 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665529 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/56306845-377c-4423-8ae8-4f5824970788-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-systemd-units\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzf2n\" (UniqueName: \"kubernetes.io/projected/da93ce6c-22ad-4a30-a065-834915acdf65-kube-api-access-pzf2n\") pod \"iptables-alerter-x6z2n\" (UID: \"da93ce6c-22ad-4a30-a065-834915acdf65\") " pod="openshift-network-operator/iptables-alerter-x6z2n"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665558 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-run\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665595 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-run-netns\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-node-log\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-tuned\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-run-netns\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-node-log\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665675 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b42bbbf-91b9-4c9a-b525-a6832184768c-env-overrides\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-systemd\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-sys\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665767 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f587afea-01f4-4981-9efb-09779dde706b-host\") pod \"node-ca-dj5pp\" (UID: \"f587afea-01f4-4981-9efb-09779dde706b\") " pod="openshift-image-registry/node-ca-dj5pp"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-sys\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-systemd\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/56306845-377c-4423-8ae8-4f5824970788-system-cni-dir\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665876 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/961e60a9-039d-43e7-8613-150ee4d6b85d-cni-binary-copy\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665900 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-hostroot\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.665939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665925 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/56306845-377c-4423-8ae8-4f5824970788-cnibin\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2vh9\" (UniqueName: \"kubernetes.io/projected/1e68ff96-cb19-4476-832d-2335b154a1a7-kube-api-access-h2vh9\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.665997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-kubelet\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666008 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-kubelet\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-modprobe-d\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666102 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/56306845-377c-4423-8ae8-4f5824970788-cni-binary-copy\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666148 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-slash\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-run-openvswitch\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b42bbbf-91b9-4c9a-b525-a6832184768c-env-overrides\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-modprobe-d\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666221 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666253 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-run-openvswitch\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666223 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-slash\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666250 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b42bbbf-91b9-4c9a-b525-a6832184768c-ovnkube-config\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666324 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-lib-modules\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.666744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1d98373a-f359-466d-8c72-1e9b4e835c04-hosts-file\") pod \"node-resolver-w7h7r\" (UID: \"1d98373a-f359-466d-8c72-1e9b4e835c04\") " pod="openshift-dns/node-resolver-w7h7r"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/56306845-377c-4423-8ae8-4f5824970788-os-release\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-run-ovn-kubernetes\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666434 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-kubernetes\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-tmp\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-var-lib-openvswitch\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-run-ovn\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666517 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-log-socket\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fhhw\" (UniqueName: \"kubernetes.io/projected/1b42bbbf-91b9-4c9a-b525-a6832184768c-kube-api-access-6fhhw\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fbd09ec2-88c2-46c7-b3da-09c43c74e6ab-agent-certs\") pod \"konnectivity-agent-c4k78\" (UID: \"fbd09ec2-88c2-46c7-b3da-09c43c74e6ab\") " pod="kube-system/konnectivity-agent-c4k78"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-kubernetes\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d98373a-f359-466d-8c72-1e9b4e835c04-tmp-dir\") pod \"node-resolver-w7h7r\" (UID: \"1d98373a-f359-466d-8c72-1e9b4e835c04\") " pod="openshift-dns/node-resolver-w7h7r"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-etc-kubernetes\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-run-systemd\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-log-socket\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b42bbbf-91b9-4c9a-b525-a6832184768c-ovn-node-metrics-cert\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666713 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-multus-cni-dir\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.667529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-run-k8s-cni-cncf-io\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666771 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpjhb\" (UniqueName: \"kubernetes.io/projected/961e60a9-039d-43e7-8613-150ee4d6b85d-kube-api-access-fpjhb\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666797 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt8pn\" (UniqueName: \"kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn\") pod \"network-check-target-jv9m2\" (UID: \"28a44ec7-295d-426e-86f6-f5b9bd257fb5\") " pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d98373a-f359-466d-8c72-1e9b4e835c04-tmp-dir\") pod \"node-resolver-w7h7r\" (UID: \"1d98373a-f359-466d-8c72-1e9b4e835c04\") " pod="openshift-dns/node-resolver-w7h7r"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666822 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/da93ce6c-22ad-4a30-a065-834915acdf65-iptables-alerter-script\") pod \"iptables-alerter-x6z2n\" (UID: \"da93ce6c-22ad-4a30-a065-834915acdf65\") " pod="openshift-network-operator/iptables-alerter-x6z2n"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da93ce6c-22ad-4a30-a065-834915acdf65-host-slash\") pod \"iptables-alerter-x6z2n\" (UID: \"da93ce6c-22ad-4a30-a065-834915acdf65\") " pod="openshift-network-operator/iptables-alerter-x6z2n"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b42bbbf-91b9-4c9a-b525-a6832184768c-ovnkube-config\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-sysconfig\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-etc-openvswitch\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666922 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-run-ovn-kubernetes\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-host\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1d98373a-f359-466d-8c72-1e9b4e835c04-hosts-file\") pod \"node-resolver-w7h7r\" (UID: \"1d98373a-f359-466d-8c72-1e9b4e835c04\") " pod="openshift-dns/node-resolver-w7h7r"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667009 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-etc-openvswitch\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667012 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-sysconfig\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rmg9\" (UniqueName: \"kubernetes.io/projected/414aeb12-26d8-4994-8fe8-1cad0342c929-kube-api-access-4rmg9\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.666531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-lib-modules\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667060 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56306845-377c-4423-8ae8-4f5824970788-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.668299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667099 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-run-systemd\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667108 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName:
\"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-var-lib-openvswitch\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-run-ovn\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-host\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-system-cni-dir\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-var-lib-kubelet\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667239 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-sysctl-conf\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-cnibin\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-cni-netd\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667405 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-socket-dir\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-sysctl-conf\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-registration-dir\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667460 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-device-dir\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667486 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b42bbbf-91b9-4c9a-b525-a6832184768c-host-cni-netd\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667529 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-sys-fs\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fbd09ec2-88c2-46c7-b3da-09c43c74e6ab-konnectivity-ca\") pod \"konnectivity-agent-c4k78\" (UID: \"fbd09ec2-88c2-46c7-b3da-09c43c74e6ab\") " pod="kube-system/konnectivity-agent-c4k78" Apr 16 04:24:14.669075 ip-10-0-133-103 
kubenswrapper[2575]: I0416 04:24:14.667609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.669075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv4l2\" (UniqueName: \"kubernetes.io/projected/56306845-377c-4423-8ae8-4f5824970788-kube-api-access-bv4l2\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667741 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-os-release\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667765 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-var-lib-cni-bin\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667793 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-run-multus-certs\") pod 
\"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-var-lib-kubelet\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667848 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xxbj\" (UniqueName: \"kubernetes.io/projected/1d98373a-f359-466d-8c72-1e9b4e835c04-kube-api-access-8xxbj\") pod \"node-resolver-w7h7r\" (UID: \"1d98373a-f359-466d-8c72-1e9b4e835c04\") " pod="openshift-dns/node-resolver-w7h7r" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-var-lib-kubelet\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/f587afea-01f4-4981-9efb-09779dde706b-serviceca\") pod \"node-ca-dj5pp\" (UID: \"f587afea-01f4-4981-9efb-09779dde706b\") " pod="openshift-image-registry/node-ca-dj5pp" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-run-netns\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.667979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b42bbbf-91b9-4c9a-b525-a6832184768c-ovnkube-script-lib\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.668013 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-etc-selinux\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.668042 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/56306845-377c-4423-8ae8-4f5824970788-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.668073 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-multus-conf-dir\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.668087 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fbd09ec2-88c2-46c7-b3da-09c43c74e6ab-konnectivity-ca\") pod \"konnectivity-agent-c4k78\" (UID: \"fbd09ec2-88c2-46c7-b3da-09c43c74e6ab\") " pod="kube-system/konnectivity-agent-c4k78" Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:14.668284 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:14.668392 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs podName:414aeb12-26d8-4994-8fe8-1cad0342c929 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:15.168362037 +0000 UTC m=+3.071138031 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs") pod "network-metrics-daemon-875wc" (UID: "414aeb12-26d8-4994-8fe8-1cad0342c929") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:14.669916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.668621 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b42bbbf-91b9-4c9a-b525-a6832184768c-ovnkube-script-lib\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.670736 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.669651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-etc-tuned\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.670736 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.669760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-tmp\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.670736 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.670056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b42bbbf-91b9-4c9a-b525-a6832184768c-ovn-node-metrics-cert\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.670916 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.670849 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fbd09ec2-88c2-46c7-b3da-09c43c74e6ab-agent-certs\") pod \"konnectivity-agent-c4k78\" (UID: \"fbd09ec2-88c2-46c7-b3da-09c43c74e6ab\") " pod="kube-system/konnectivity-agent-c4k78" Apr 16 04:24:14.672770 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.672748 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gcfb\" (UniqueName: \"kubernetes.io/projected/bc2d5478-4b9a-4191-bbbd-ecf1ba21e214-kube-api-access-4gcfb\") pod \"tuned-864l5\" (UID: \"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214\") " pod="openshift-cluster-node-tuning-operator/tuned-864l5" Apr 16 04:24:14.674527 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.674485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fhhw\" (UniqueName: \"kubernetes.io/projected/1b42bbbf-91b9-4c9a-b525-a6832184768c-kube-api-access-6fhhw\") pod \"ovnkube-node-4bdrp\" (UID: \"1b42bbbf-91b9-4c9a-b525-a6832184768c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:24:14.674736 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.674716 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rmg9\" (UniqueName: \"kubernetes.io/projected/414aeb12-26d8-4994-8fe8-1cad0342c929-kube-api-access-4rmg9\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:14.674989 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.674970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xxbj\" (UniqueName: \"kubernetes.io/projected/1d98373a-f359-466d-8c72-1e9b4e835c04-kube-api-access-8xxbj\") pod \"node-resolver-w7h7r\" (UID: \"1d98373a-f359-466d-8c72-1e9b4e835c04\") " pod="openshift-dns/node-resolver-w7h7r" Apr 16 04:24:14.706713 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.706655 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-103.ec2.internal" event={"ID":"036d1e1671f062f79bfb2c499603021f","Type":"ContainerStarted","Data":"08ba7d2ae040469425967744615dd86e9157d551f9e810a1d3e5a4233429dff3"} Apr 16 04:24:14.707647 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.707617 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal" event={"ID":"b587bad82e85625e2d9083084ab92f62","Type":"ContainerStarted","Data":"62282cd0704e7556fc4dd96d1b7833b1f1da9007c8da74e90f365d3cc5577399"} Apr 16 04:24:14.769303 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-var-lib-kubelet\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.769303 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-cnibin\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-socket-dir\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-registration-dir\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-device-dir\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-sys-fs\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769426 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-cnibin\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769447 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bv4l2\" (UniqueName: \"kubernetes.io/projected/56306845-377c-4423-8ae8-4f5824970788-kube-api-access-bv4l2\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-os-release\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-registration-dir\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769488 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-device-dir\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-var-lib-cni-bin\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: 
I0416 04:24:14.769508 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-sys-fs\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-var-lib-cni-bin\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.769549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769538 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-run-multus-certs\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt" Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769562 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-socket-dir\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" Apr 16 04:24:14.770203 ip-10-0-133-103 
kubenswrapper[2575]: I0416 04:24:14.769567    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-os-release\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769574    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-run-multus-certs\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769588    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f587afea-01f4-4981-9efb-09779dde706b-serviceca\") pod \"node-ca-dj5pp\" (UID: \"f587afea-01f4-4981-9efb-09779dde706b\") " pod="openshift-image-registry/node-ca-dj5pp"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769625    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-run-netns\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769664    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-etc-selinux\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769689    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/56306845-377c-4423-8ae8-4f5824970788-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769697    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-run-netns\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769713    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-multus-conf-dir\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769741    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-multus-socket-dir-parent\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769779    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-var-lib-cni-multus\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769803    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/961e60a9-039d-43e7-8613-150ee4d6b85d-multus-daemon-config\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769824    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1e68ff96-cb19-4476-832d-2335b154a1a7-etc-selinux\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769832    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pp5f6\" (UniqueName: \"kubernetes.io/projected/f587afea-01f4-4981-9efb-09779dde706b-kube-api-access-pp5f6\") pod \"node-ca-dj5pp\" (UID: \"f587afea-01f4-4981-9efb-09779dde706b\") " pod="openshift-image-registry/node-ca-dj5pp"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769839    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-multus-conf-dir\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769848    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-multus-socket-dir-parent\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769870    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-var-lib-cni-multus\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769898    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/56306845-377c-4423-8ae8-4f5824970788-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769933    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzf2n\" (UniqueName: \"kubernetes.io/projected/da93ce6c-22ad-4a30-a065-834915acdf65-kube-api-access-pzf2n\") pod \"iptables-alerter-x6z2n\" (UID: \"da93ce6c-22ad-4a30-a065-834915acdf65\") " pod="openshift-network-operator/iptables-alerter-x6z2n"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769968    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f587afea-01f4-4981-9efb-09779dde706b-host\") pod \"node-ca-dj5pp\" (UID: \"f587afea-01f4-4981-9efb-09779dde706b\") " pod="openshift-image-registry/node-ca-dj5pp"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.769995    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/56306845-377c-4423-8ae8-4f5824970788-system-cni-dir\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770002    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-var-lib-kubelet\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770023    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/961e60a9-039d-43e7-8613-150ee4d6b85d-cni-binary-copy\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770040    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f587afea-01f4-4981-9efb-09779dde706b-serviceca\") pod \"node-ca-dj5pp\" (UID: \"f587afea-01f4-4981-9efb-09779dde706b\") " pod="openshift-image-registry/node-ca-dj5pp"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770050    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-hostroot\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770060    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f587afea-01f4-4981-9efb-09779dde706b-host\") pod \"node-ca-dj5pp\" (UID: \"f587afea-01f4-4981-9efb-09779dde706b\") " pod="openshift-image-registry/node-ca-dj5pp"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770074    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/56306845-377c-4423-8ae8-4f5824970788-cnibin\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770101    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2vh9\" (UniqueName: \"kubernetes.io/projected/1e68ff96-cb19-4476-832d-2335b154a1a7-kube-api-access-h2vh9\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770126    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/56306845-377c-4423-8ae8-4f5824970788-cni-binary-copy\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770149    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/56306845-377c-4423-8ae8-4f5824970788-os-release\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770174    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-etc-kubernetes\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770219    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-multus-cni-dir\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770235    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-run-k8s-cni-cncf-io\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.770908 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770252    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpjhb\" (UniqueName: \"kubernetes.io/projected/961e60a9-039d-43e7-8613-150ee4d6b85d-kube-api-access-fpjhb\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770132    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/56306845-377c-4423-8ae8-4f5824970788-cnibin\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770299    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt8pn\" (UniqueName: \"kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn\") pod \"network-check-target-jv9m2\" (UID: \"28a44ec7-295d-426e-86f6-f5b9bd257fb5\") " pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770317    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-etc-kubernetes\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770349    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/56306845-377c-4423-8ae8-4f5824970788-os-release\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770355    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/da93ce6c-22ad-4a30-a065-834915acdf65-iptables-alerter-script\") pod \"iptables-alerter-x6z2n\" (UID: \"da93ce6c-22ad-4a30-a065-834915acdf65\") " pod="openshift-network-operator/iptables-alerter-x6z2n"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770379    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/56306845-377c-4423-8ae8-4f5824970788-system-cni-dir\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770386    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/56306845-377c-4423-8ae8-4f5824970788-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770381    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-host-run-k8s-cni-cncf-io\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770428    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-hostroot\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770441    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da93ce6c-22ad-4a30-a065-834915acdf65-host-slash\") pod \"iptables-alerter-x6z2n\" (UID: \"da93ce6c-22ad-4a30-a065-834915acdf65\") " pod="openshift-network-operator/iptables-alerter-x6z2n"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770489    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/56306845-377c-4423-8ae8-4f5824970788-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770518    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-multus-cni-dir\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770556    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da93ce6c-22ad-4a30-a065-834915acdf65-host-slash\") pod \"iptables-alerter-x6z2n\" (UID: \"da93ce6c-22ad-4a30-a065-834915acdf65\") " pod="openshift-network-operator/iptables-alerter-x6z2n"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770590    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/961e60a9-039d-43e7-8613-150ee4d6b85d-cni-binary-copy\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770585    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56306845-377c-4423-8ae8-4f5824970788-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770645    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-system-cni-dir\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.771440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770720    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/961e60a9-039d-43e7-8613-150ee4d6b85d-system-cni-dir\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.771968 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770805    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/da93ce6c-22ad-4a30-a065-834915acdf65-iptables-alerter-script\") pod \"iptables-alerter-x6z2n\" (UID: \"da93ce6c-22ad-4a30-a065-834915acdf65\") " pod="openshift-network-operator/iptables-alerter-x6z2n"
Apr 16 04:24:14.771968 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770814    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/56306845-377c-4423-8ae8-4f5824970788-cni-binary-copy\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.771968 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770952    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/961e60a9-039d-43e7-8613-150ee4d6b85d-multus-daemon-config\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.771968 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.770985    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56306845-377c-4423-8ae8-4f5824970788-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.776002 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:14.775873    2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 04:24:14.776002 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:14.775896    2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 04:24:14.776002 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:14.775909    2575 projected.go:194] Error preparing data for projected volume kube-api-access-rt8pn for pod openshift-network-diagnostics/network-check-target-jv9m2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 04:24:14.776002 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:14.775987    2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn podName:28a44ec7-295d-426e-86f6-f5b9bd257fb5 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:15.275967743 +0000 UTC m=+3.178743735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rt8pn" (UniqueName: "kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn") pod "network-check-target-jv9m2" (UID: "28a44ec7-295d-426e-86f6-f5b9bd257fb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 04:24:14.778625 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.778570    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzf2n\" (UniqueName: \"kubernetes.io/projected/da93ce6c-22ad-4a30-a065-834915acdf65-kube-api-access-pzf2n\") pod \"iptables-alerter-x6z2n\" (UID: \"da93ce6c-22ad-4a30-a065-834915acdf65\") " pod="openshift-network-operator/iptables-alerter-x6z2n"
Apr 16 04:24:14.778888 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.778863    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv4l2\" (UniqueName: \"kubernetes.io/projected/56306845-377c-4423-8ae8-4f5824970788-kube-api-access-bv4l2\") pod \"multus-additional-cni-plugins-cxwbn\" (UID: \"56306845-377c-4423-8ae8-4f5824970788\") " pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.778988 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.778958    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2vh9\" (UniqueName: \"kubernetes.io/projected/1e68ff96-cb19-4476-832d-2335b154a1a7-kube-api-access-h2vh9\") pod \"aws-ebs-csi-driver-node-5qm28\" (UID: \"1e68ff96-cb19-4476-832d-2335b154a1a7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28"
Apr 16 04:24:14.779034 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.778965    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp5f6\" (UniqueName: \"kubernetes.io/projected/f587afea-01f4-4981-9efb-09779dde706b-kube-api-access-pp5f6\") pod \"node-ca-dj5pp\" (UID: \"f587afea-01f4-4981-9efb-09779dde706b\") " pod="openshift-image-registry/node-ca-dj5pp"
Apr 16 04:24:14.779085 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.779032    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpjhb\" (UniqueName: \"kubernetes.io/projected/961e60a9-039d-43e7-8613-150ee4d6b85d-kube-api-access-fpjhb\") pod \"multus-qksrt\" (UID: \"961e60a9-039d-43e7-8613-150ee4d6b85d\") " pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.854805 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.854771    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w7h7r"
Apr 16 04:24:14.863695 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.863665    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:14.872679 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.872650    2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:14.873720 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.873691    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c4k78"
Apr 16 04:24:14.880307 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.880283    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-864l5"
Apr 16 04:24:14.887010 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.886984    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qksrt"
Apr 16 04:24:14.894752 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.894729    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cxwbn"
Apr 16 04:24:14.900376 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.900356    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-x6z2n"
Apr 16 04:24:14.906017 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.905993    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28"
Apr 16 04:24:14.911663 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:14.911630    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dj5pp"
Apr 16 04:24:15.173196 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.173097    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:24:15.173359 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:15.173255    2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 04:24:15.173359 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:15.173322    2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs podName:414aeb12-26d8-4994-8fe8-1cad0342c929 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:16.173302012 +0000 UTC m=+4.076078007 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs") pod "network-metrics-daemon-875wc" (UID: "414aeb12-26d8-4994-8fe8-1cad0342c929") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 04:24:15.375188 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.375149    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt8pn\" (UniqueName: \"kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn\") pod \"network-check-target-jv9m2\" (UID: \"28a44ec7-295d-426e-86f6-f5b9bd257fb5\") " pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:24:15.375360 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:15.375342    2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 04:24:15.375410 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:15.375366    2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 04:24:15.375410 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:15.375377    2575 projected.go:194] Error preparing data for projected volume kube-api-access-rt8pn for pod openshift-network-diagnostics/network-check-target-jv9m2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 04:24:15.375517 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:15.375430    2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn podName:28a44ec7-295d-426e-86f6-f5b9bd257fb5 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:16.375416763 +0000 UTC m=+4.278192759 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rt8pn" (UniqueName: "kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn") pod "network-check-target-jv9m2" (UID: "28a44ec7-295d-426e-86f6-f5b9bd257fb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 04:24:15.437594 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:15.437561    2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda93ce6c_22ad_4a30_a065_834915acdf65.slice/crio-e7bf5f6584251e242e2f3e4f179924ce161bf096b3fff5b575ebf15e648f2506 WatchSource:0}: Error finding container e7bf5f6584251e242e2f3e4f179924ce161bf096b3fff5b575ebf15e648f2506: Status 404 returned error can't find the container with id e7bf5f6584251e242e2f3e4f179924ce161bf096b3fff5b575ebf15e648f2506
Apr 16 04:24:15.438435 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:15.438379    2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd09ec2_88c2_46c7_b3da_09c43c74e6ab.slice/crio-a18b1affbc9edd597013efc62f6b18778db1b7532671f00d1f8379512b59cc8f WatchSource:0}: Error finding container a18b1affbc9edd597013efc62f6b18778db1b7532671f00d1f8379512b59cc8f: Status 404 returned error can't find the container with id a18b1affbc9edd597013efc62f6b18778db1b7532671f00d1f8379512b59cc8f
Apr 16 04:24:15.441886 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:15.441864    2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b42bbbf_91b9_4c9a_b525_a6832184768c.slice/crio-ef50dfac17a89a37f724b7e90f532d24a2a4d68479892fd10edf6b8a3684f3b7 WatchSource:0}: Error finding container ef50dfac17a89a37f724b7e90f532d24a2a4d68479892fd10edf6b8a3684f3b7: Status 404 returned error can't find the container with id ef50dfac17a89a37f724b7e90f532d24a2a4d68479892fd10edf6b8a3684f3b7
Apr 16 04:24:15.443110 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:15.443085    2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod961e60a9_039d_43e7_8613_150ee4d6b85d.slice/crio-acab8901befab1b11e1ee7d08eb2cf05a04a432212727633aaa7ddc880b25f65 WatchSource:0}: Error finding container acab8901befab1b11e1ee7d08eb2cf05a04a432212727633aaa7ddc880b25f65: Status 404 returned error can't find the container with id acab8901befab1b11e1ee7d08eb2cf05a04a432212727633aaa7ddc880b25f65
Apr 16 04:24:15.444251 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:15.444226    2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf587afea_01f4_4981_9efb_09779dde706b.slice/crio-563286226e2b4e24e68942b44223c9264a3da5c9e7a4ab692b724ef06bf4bccf WatchSource:0}: Error finding container 563286226e2b4e24e68942b44223c9264a3da5c9e7a4ab692b724ef06bf4bccf: Status 404 returned error can't find the container with id 563286226e2b4e24e68942b44223c9264a3da5c9e7a4ab692b724ef06bf4bccf
Apr 16 04:24:15.445126 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:15.444698    2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e68ff96_cb19_4476_832d_2335b154a1a7.slice/crio-d0c2a4021ad519a5512856e296b2e916fb96f642c0fcf442eb63b5c42431b1c3 WatchSource:0}: Error finding container d0c2a4021ad519a5512856e296b2e916fb96f642c0fcf442eb63b5c42431b1c3: Status 404 returned error can't find the container with id d0c2a4021ad519a5512856e296b2e916fb96f642c0fcf442eb63b5c42431b1c3
Apr 16 04:24:15.445617 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:15.445595    2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc2d5478_4b9a_4191_bbbd_ecf1ba21e214.slice/crio-53540a9ffd5ef217c480702f66357d58d61bc7cd7b852628131d3436b7683864 WatchSource:0}: Error finding container 53540a9ffd5ef217c480702f66357d58d61bc7cd7b852628131d3436b7683864: Status 404 returned error can't find the container with id 53540a9ffd5ef217c480702f66357d58d61bc7cd7b852628131d3436b7683864
Apr 16 04:24:15.467567 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:15.467535    2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56306845_377c_4423_8ae8_4f5824970788.slice/crio-d6c3cbee510c7badfa411808f7b3453dac3ebbaae0cd368688834bbcc9110924 WatchSource:0}: Error finding container d6c3cbee510c7badfa411808f7b3453dac3ebbaae0cd368688834bbcc9110924: Status 404 returned error can't find the container with id d6c3cbee510c7badfa411808f7b3453dac3ebbaae0cd368688834bbcc9110924
Apr 16 04:24:15.468457 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:15.468348    2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d98373a_f359_466d_8c72_1e9b4e835c04.slice/crio-590035c03cd325642f2f19084cc3e2d5de117f66f1d8df40ec82be363a01107e WatchSource:0}: Error finding container 590035c03cd325642f2f19084cc3e2d5de117f66f1d8df40ec82be363a01107e: Status 404 returned error can't find the container with id 590035c03cd325642f2f19084cc3e2d5de117f66f1d8df40ec82be363a01107e
Apr 16 04:24:15.583943 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.583753    2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 04:19:13 +0000 UTC" deadline="2027-09-21 23:06:41.966568976 +0000 UTC"
Apr 16 04:24:15.583943 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.583938    2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12570h42m26.382634212s"
Apr 16 04:24:15.702481 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.702364    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:24:15.702610 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:15.702494    2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929"
Apr 16 04:24:15.710484 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.710448    2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cxwbn" event={"ID":"56306845-377c-4423-8ae8-4f5824970788","Type":"ContainerStarted","Data":"d6c3cbee510c7badfa411808f7b3453dac3ebbaae0cd368688834bbcc9110924"}
Apr 16 04:24:15.711417 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.711394    2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-864l5" event={"ID":"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214","Type":"ContainerStarted","Data":"53540a9ffd5ef217c480702f66357d58d61bc7cd7b852628131d3436b7683864"}
Apr 16 04:24:15.712310 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.712289    2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" event={"ID":"1e68ff96-cb19-4476-832d-2335b154a1a7","Type":"ContainerStarted","Data":"d0c2a4021ad519a5512856e296b2e916fb96f642c0fcf442eb63b5c42431b1c3"}
Apr 16 04:24:15.713200 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.713180    2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qksrt" event={"ID":"961e60a9-039d-43e7-8613-150ee4d6b85d","Type":"ContainerStarted","Data":"acab8901befab1b11e1ee7d08eb2cf05a04a432212727633aaa7ddc880b25f65"}
Apr 16 04:24:15.715358 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.715311    2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" event={"ID":"1b42bbbf-91b9-4c9a-b525-a6832184768c","Type":"ContainerStarted","Data":"ef50dfac17a89a37f724b7e90f532d24a2a4d68479892fd10edf6b8a3684f3b7"}
Apr 16 04:24:15.717124 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.717097    2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w7h7r" event={"ID":"1d98373a-f359-466d-8c72-1e9b4e835c04","Type":"ContainerStarted","Data":"590035c03cd325642f2f19084cc3e2d5de117f66f1d8df40ec82be363a01107e"}
Apr 16 04:24:15.718190 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.718172    2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dj5pp" event={"ID":"f587afea-01f4-4981-9efb-09779dde706b","Type":"ContainerStarted","Data":"563286226e2b4e24e68942b44223c9264a3da5c9e7a4ab692b724ef06bf4bccf"}
Apr 16 04:24:15.722915 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.722895    2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:15.723798 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.723769    2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c4k78" event={"ID":"fbd09ec2-88c2-46c7-b3da-09c43c74e6ab","Type":"ContainerStarted","Data":"a18b1affbc9edd597013efc62f6b18778db1b7532671f00d1f8379512b59cc8f"}
Apr 16 04:24:15.724666 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.724648    2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-x6z2n"
event={"ID":"da93ce6c-22ad-4a30-a065-834915acdf65","Type":"ContainerStarted","Data":"e7bf5f6584251e242e2f3e4f179924ce161bf096b3fff5b575ebf15e648f2506"} Apr 16 04:24:15.726026 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.726008 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-103.ec2.internal" event={"ID":"036d1e1671f062f79bfb2c499603021f","Type":"ContainerStarted","Data":"87738e8ea8b6202d873bca1143cbbf0a285d0456ba7b02ede80cb14d33cdaab4"} Apr 16 04:24:15.738840 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:15.738792 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-103.ec2.internal" podStartSLOduration=1.73877342 podStartE2EDuration="1.73877342s" podCreationTimestamp="2026-04-16 04:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:24:15.738485083 +0000 UTC m=+3.641261082" watchObservedRunningTime="2026-04-16 04:24:15.73877342 +0000 UTC m=+3.641549419" Apr 16 04:24:16.182118 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:16.180866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:16.182118 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:16.181015 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:16.182118 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:16.181078 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs 
podName:414aeb12-26d8-4994-8fe8-1cad0342c929 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:18.181059986 +0000 UTC m=+6.083835968 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs") pod "network-metrics-daemon-875wc" (UID: "414aeb12-26d8-4994-8fe8-1cad0342c929") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:16.383100 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:16.382933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt8pn\" (UniqueName: \"kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn\") pod \"network-check-target-jv9m2\" (UID: \"28a44ec7-295d-426e-86f6-f5b9bd257fb5\") " pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:16.383281 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:16.383148 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:16.383281 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:16.383168 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:16.383281 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:16.383181 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rt8pn for pod openshift-network-diagnostics/network-check-target-jv9m2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:16.383281 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:16.383238 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn podName:28a44ec7-295d-426e-86f6-f5b9bd257fb5 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:18.38322056 +0000 UTC m=+6.285996552 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rt8pn" (UniqueName: "kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn") pod "network-check-target-jv9m2" (UID: "28a44ec7-295d-426e-86f6-f5b9bd257fb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:16.705959 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:16.705425 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:16.705959 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:16.705547 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5" Apr 16 04:24:16.733437 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:16.733400 2575 generic.go:358] "Generic (PLEG): container finished" podID="b587bad82e85625e2d9083084ab92f62" containerID="8735c504b3303c583562722a1ac220b710996d1884ddfcfbf1139b57ec685a4f" exitCode=0 Apr 16 04:24:16.734403 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:16.734374 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal" event={"ID":"b587bad82e85625e2d9083084ab92f62","Type":"ContainerDied","Data":"8735c504b3303c583562722a1ac220b710996d1884ddfcfbf1139b57ec685a4f"} Apr 16 04:24:17.703009 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:17.702973 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:17.703201 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:17.703168 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929" Apr 16 04:24:17.743539 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:17.742841 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal" event={"ID":"b587bad82e85625e2d9083084ab92f62","Type":"ContainerStarted","Data":"d35bbf6ebafec3e8abe23f295da24eea18d13bb7c944e3a1723567a238878d3f"} Apr 16 04:24:18.200298 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:18.199675 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:18.200298 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:18.199835 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:18.200298 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:18.199897 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs podName:414aeb12-26d8-4994-8fe8-1cad0342c929 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:22.199878834 +0000 UTC m=+10.102654826 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs") pod "network-metrics-daemon-875wc" (UID: "414aeb12-26d8-4994-8fe8-1cad0342c929") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:18.400936 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:18.400866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt8pn\" (UniqueName: \"kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn\") pod \"network-check-target-jv9m2\" (UID: \"28a44ec7-295d-426e-86f6-f5b9bd257fb5\") " pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:18.401118 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:18.401066 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:18.401118 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:18.401087 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:18.401118 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:18.401099 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rt8pn for pod openshift-network-diagnostics/network-check-target-jv9m2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:18.401271 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:18.401159 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn podName:28a44ec7-295d-426e-86f6-f5b9bd257fb5 nodeName:}" failed. 
No retries permitted until 2026-04-16 04:24:22.401139634 +0000 UTC m=+10.303915625 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rt8pn" (UniqueName: "kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn") pod "network-check-target-jv9m2" (UID: "28a44ec7-295d-426e-86f6-f5b9bd257fb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:18.703349 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:18.702862 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:18.703349 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:18.702984 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5" Apr 16 04:24:19.703445 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:19.702942 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:19.703445 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:19.703085 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929" Apr 16 04:24:20.702615 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:20.702573 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:20.702844 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:20.702709 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5" Apr 16 04:24:21.702624 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:21.702591 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:21.703100 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:21.702742 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929" Apr 16 04:24:22.234685 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:22.234619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:22.234905 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:22.234788 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:22.234905 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:22.234866 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs podName:414aeb12-26d8-4994-8fe8-1cad0342c929 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:30.234844802 +0000 UTC m=+18.137620795 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs") pod "network-metrics-daemon-875wc" (UID: "414aeb12-26d8-4994-8fe8-1cad0342c929") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:22.436268 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:22.436234 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt8pn\" (UniqueName: \"kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn\") pod \"network-check-target-jv9m2\" (UID: \"28a44ec7-295d-426e-86f6-f5b9bd257fb5\") " pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:22.436442 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:22.436414 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:22.436442 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:22.436433 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:22.436555 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:22.436445 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rt8pn for pod openshift-network-diagnostics/network-check-target-jv9m2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:22.436555 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:22.436503 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn podName:28a44ec7-295d-426e-86f6-f5b9bd257fb5 nodeName:}" failed. 
No retries permitted until 2026-04-16 04:24:30.436485155 +0000 UTC m=+18.339261145 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rt8pn" (UniqueName: "kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn") pod "network-check-target-jv9m2" (UID: "28a44ec7-295d-426e-86f6-f5b9bd257fb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:22.704427 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:22.703908 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:22.704427 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:22.704013 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5" Apr 16 04:24:23.702489 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:23.702450 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:23.702681 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:23.702606 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929" Apr 16 04:24:24.702445 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:24.702407 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:24.702823 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:24.702551 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5" Apr 16 04:24:25.702650 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:25.702614 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:25.703061 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:25.702766 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929" Apr 16 04:24:26.702496 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:26.702466 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:26.702673 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:26.702586 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5" Apr 16 04:24:27.702583 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:27.702548 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:27.702740 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:27.702669 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929" Apr 16 04:24:28.703055 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:28.703022 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:28.703501 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:28.703146 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5" Apr 16 04:24:29.702426 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:29.702385 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:29.702712 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:29.702509 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929" Apr 16 04:24:30.291861 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:30.291826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:30.292278 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:30.291993 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:30.292278 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:30.292083 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs podName:414aeb12-26d8-4994-8fe8-1cad0342c929 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:46.292063545 +0000 UTC m=+34.194839522 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs") pod "network-metrics-daemon-875wc" (UID: "414aeb12-26d8-4994-8fe8-1cad0342c929") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:30.493384 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:30.493347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt8pn\" (UniqueName: \"kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn\") pod \"network-check-target-jv9m2\" (UID: \"28a44ec7-295d-426e-86f6-f5b9bd257fb5\") " pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:30.493560 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:30.493508 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:30.493560 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:30.493530 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:30.493560 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:30.493543 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rt8pn for pod openshift-network-diagnostics/network-check-target-jv9m2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:30.493684 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:30.493600 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn podName:28a44ec7-295d-426e-86f6-f5b9bd257fb5 nodeName:}" failed. 
No retries permitted until 2026-04-16 04:24:46.493587202 +0000 UTC m=+34.396363192 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rt8pn" (UniqueName: "kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn") pod "network-check-target-jv9m2" (UID: "28a44ec7-295d-426e-86f6-f5b9bd257fb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 04:24:30.702959 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:30.702873 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:24:30.703114 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:30.703013 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5"
Apr 16 04:24:31.702695 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:31.702607 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:24:31.703171 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:31.702747 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929"
Apr 16 04:24:32.703466 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:32.703440 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:24:32.703835 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:32.703554 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5"
Apr 16 04:24:33.702889 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.702510 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:24:33.703004 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:33.702981 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929"
Apr 16 04:24:33.794150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.794070 2575 generic.go:358] "Generic (PLEG): container finished" podID="56306845-377c-4423-8ae8-4f5824970788" containerID="546cdb399eb959b962b253175afd0e87c0d719fbdd2e346e340c80384f439bff" exitCode=0
Apr 16 04:24:33.794150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.794090 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 04:24:33.794150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.794145 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cxwbn" event={"ID":"56306845-377c-4423-8ae8-4f5824970788","Type":"ContainerDied","Data":"546cdb399eb959b962b253175afd0e87c0d719fbdd2e346e340c80384f439bff"}
Apr 16 04:24:33.795747 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.795697 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-864l5" event={"ID":"bc2d5478-4b9a-4191-bbbd-ecf1ba21e214","Type":"ContainerStarted","Data":"cd40e33538a11c8a11576455319092f85501a24615b1ecc11047080d3212d550"}
Apr 16 04:24:33.797172 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.797153 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" event={"ID":"1e68ff96-cb19-4476-832d-2335b154a1a7","Type":"ContainerStarted","Data":"fc0bb40a4b18b5db5d3a33e9e9a4857c5b532abffe219f1643fb9201a747bf30"}
Apr 16 04:24:33.797243 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.797179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" event={"ID":"1e68ff96-cb19-4476-832d-2335b154a1a7","Type":"ContainerStarted","Data":"aa44d0d67d9d6e71239ed8fc2a853a0eadc5e8089781b5a9e767d750b5729f91"}
Apr 16 04:24:33.798441 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.798414 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qksrt" event={"ID":"961e60a9-039d-43e7-8613-150ee4d6b85d","Type":"ContainerStarted","Data":"cf1bc61151503e46449ec4aee216d721faf9944e7f29618c54997f87e4239678"}
Apr 16 04:24:33.801171 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.801147 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" event={"ID":"1b42bbbf-91b9-4c9a-b525-a6832184768c","Type":"ContainerStarted","Data":"3e9600857eef13ca9b6bdd706d3bec4cb6557ac64b5753b4242338c939ac0c27"}
Apr 16 04:24:33.801171 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.801171 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" event={"ID":"1b42bbbf-91b9-4c9a-b525-a6832184768c","Type":"ContainerStarted","Data":"96dc11eef9939bbb2b08fd36cbb0b8eb00c58125eed014bd49230f3efe0744e9"}
Apr 16 04:24:33.801317 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.801181 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" event={"ID":"1b42bbbf-91b9-4c9a-b525-a6832184768c","Type":"ContainerStarted","Data":"13526f9cc0ccfa634a7640604ca0fa65d55511babd1072702659194f94ec15d5"}
Apr 16 04:24:33.801317 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.801190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" event={"ID":"1b42bbbf-91b9-4c9a-b525-a6832184768c","Type":"ContainerStarted","Data":"2be82e40fd7205c7d61f0eca009ff0a0f01ace3c3fe224813b2a71dccd367a84"}
Apr 16 04:24:33.801317 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.801199 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" event={"ID":"1b42bbbf-91b9-4c9a-b525-a6832184768c","Type":"ContainerStarted","Data":"34ddda1afcccad830d1e20c56208438fd56be6622c549c87a1011ff6bc351384"}
Apr 16 04:24:33.801317 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.801212 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" event={"ID":"1b42bbbf-91b9-4c9a-b525-a6832184768c","Type":"ContainerStarted","Data":"0087b13f9a96c66fc955832886718cb4b913583a2f9909c2b529cf60d10c889d"}
Apr 16 04:24:33.802385 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.802358 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w7h7r" event={"ID":"1d98373a-f359-466d-8c72-1e9b4e835c04","Type":"ContainerStarted","Data":"18d1a126566db03bb1f3e7d7fb8fe05d0b164edb8599e74e3518c1b93496658e"}
Apr 16 04:24:33.803716 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.803689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dj5pp" event={"ID":"f587afea-01f4-4981-9efb-09779dde706b","Type":"ContainerStarted","Data":"86c86de38e27a7fa565cfa27dc1fa0baa965b078d9234a633b126c59fc3748c5"}
Apr 16 04:24:33.804878 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.804862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c4k78" event={"ID":"fbd09ec2-88c2-46c7-b3da-09c43c74e6ab","Type":"ContainerStarted","Data":"e631606279a6a259689f056b798d31bea45f694f45b62bc123f619f4063ce7ac"}
Apr 16 04:24:33.815772 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.815728 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-103.ec2.internal" podStartSLOduration=19.815717028 podStartE2EDuration="19.815717028s" podCreationTimestamp="2026-04-16 04:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:24:17.758481078 +0000 UTC m=+5.661257079" watchObservedRunningTime="2026-04-16 04:24:33.815717028 +0000 UTC m=+21.718493040"
Apr 16 04:24:33.826594 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.826535 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-c4k78" podStartSLOduration=4.699088248 podStartE2EDuration="21.826522238s" podCreationTimestamp="2026-04-16 04:24:12 +0000 UTC" firstStartedPulling="2026-04-16 04:24:15.441060522 +0000 UTC m=+3.343836500" lastFinishedPulling="2026-04-16 04:24:32.56849451 +0000 UTC m=+20.471270490" observedRunningTime="2026-04-16 04:24:33.826174142 +0000 UTC m=+21.728950140" watchObservedRunningTime="2026-04-16 04:24:33.826522238 +0000 UTC m=+21.729298236"
Apr 16 04:24:33.840982 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.840935 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-864l5" podStartSLOduration=4.73890514 podStartE2EDuration="21.84092059s" podCreationTimestamp="2026-04-16 04:24:12 +0000 UTC" firstStartedPulling="2026-04-16 04:24:15.466380789 +0000 UTC m=+3.369156767" lastFinishedPulling="2026-04-16 04:24:32.568396235 +0000 UTC m=+20.471172217" observedRunningTime="2026-04-16 04:24:33.84047632 +0000 UTC m=+21.743252319" watchObservedRunningTime="2026-04-16 04:24:33.84092059 +0000 UTC m=+21.743696605"
Apr 16 04:24:33.853729 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.853691 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w7h7r" podStartSLOduration=4.773848489 podStartE2EDuration="21.85367669s" podCreationTimestamp="2026-04-16 04:24:12 +0000 UTC" firstStartedPulling="2026-04-16 04:24:15.470815609 +0000 UTC m=+3.373591585" lastFinishedPulling="2026-04-16 04:24:32.55064381 +0000 UTC m=+20.453419786" observedRunningTime="2026-04-16 04:24:33.853404797 +0000 UTC m=+21.756180795" watchObservedRunningTime="2026-04-16 04:24:33.85367669 +0000 UTC m=+21.756452688"
Apr 16 04:24:33.868894 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.868854 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qksrt" podStartSLOduration=4.735140014 podStartE2EDuration="21.868840837s" podCreationTimestamp="2026-04-16 04:24:12 +0000 UTC" firstStartedPulling="2026-04-16 04:24:15.466362498 +0000 UTC m=+3.369138475" lastFinishedPulling="2026-04-16 04:24:32.600063322 +0000 UTC m=+20.502839298" observedRunningTime="2026-04-16 04:24:33.868637286 +0000 UTC m=+21.771413286" watchObservedRunningTime="2026-04-16 04:24:33.868840837 +0000 UTC m=+21.771616837"
Apr 16 04:24:33.880975 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:33.880934 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dj5pp" podStartSLOduration=4.778780488 podStartE2EDuration="21.880919835s" podCreationTimestamp="2026-04-16 04:24:12 +0000 UTC" firstStartedPulling="2026-04-16 04:24:15.466434581 +0000 UTC m=+3.369210561" lastFinishedPulling="2026-04-16 04:24:32.568573927 +0000 UTC m=+20.471349908" observedRunningTime="2026-04-16 04:24:33.880649922 +0000 UTC m=+21.783425920" watchObservedRunningTime="2026-04-16 04:24:33.880919835 +0000 UTC m=+21.783695953"
Apr 16 04:24:34.230721 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:34.230688 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-c4k78"
Apr 16 04:24:34.231410 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:34.231386 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-c4k78"
Apr 16 04:24:34.638648 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:34.638551 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T04:24:33.794104821Z","UUID":"aef61700-da89-4365-a61e-2aacc6ab346b","Handler":null,"Name":"","Endpoint":""}
Apr 16 04:24:34.642221 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:34.642195 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 04:24:34.642221 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:34.642228 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 04:24:34.702264 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:34.702232 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:24:34.702473 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:34.702365 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5"
Apr 16 04:24:34.809537 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:34.809503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" event={"ID":"1e68ff96-cb19-4476-832d-2335b154a1a7","Type":"ContainerStarted","Data":"31fc87809213412e51fcc94846c26535850bb9c56448ef7e1f2624b58c73f1fd"}
Apr 16 04:24:34.811273 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:34.811225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-x6z2n" event={"ID":"da93ce6c-22ad-4a30-a065-834915acdf65","Type":"ContainerStarted","Data":"b532d4b7f323bc3683d432fccc1be41084e2ca57002895a6997c5fda68765d87"}
Apr 16 04:24:34.812264 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:34.812242 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-c4k78"
Apr 16 04:24:34.812748 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:34.812731 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-c4k78"
Apr 16 04:24:34.825485 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:34.825384 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5qm28" podStartSLOduration=3.725786587 podStartE2EDuration="22.825362277s" podCreationTimestamp="2026-04-16 04:24:12 +0000 UTC" firstStartedPulling="2026-04-16 04:24:15.466537278 +0000 UTC m=+3.369313258" lastFinishedPulling="2026-04-16 04:24:34.566112966 +0000 UTC m=+22.468888948" observedRunningTime="2026-04-16 04:24:34.824732696 +0000 UTC m=+22.727508694" watchObservedRunningTime="2026-04-16 04:24:34.825362277 +0000 UTC m=+22.728138277"
Apr 16 04:24:34.851090 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:34.851031 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-x6z2n" podStartSLOduration=5.721765738 podStartE2EDuration="22.851012176s" podCreationTimestamp="2026-04-16 04:24:12 +0000 UTC" firstStartedPulling="2026-04-16 04:24:15.439571189 +0000 UTC m=+3.342347167" lastFinishedPulling="2026-04-16 04:24:32.568817628 +0000 UTC m=+20.471593605" observedRunningTime="2026-04-16 04:24:34.850611324 +0000 UTC m=+22.753387323" watchObservedRunningTime="2026-04-16 04:24:34.851012176 +0000 UTC m=+22.753788176"
Apr 16 04:24:35.703184 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:35.703100 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:24:35.703377 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:35.703239 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929"
Apr 16 04:24:35.816520 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:35.816479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" event={"ID":"1b42bbbf-91b9-4c9a-b525-a6832184768c","Type":"ContainerStarted","Data":"142534dccd3fff01f396ae231b2ba4a5a0a4d52a6c3b6789336fb48eb089b6dd"}
Apr 16 04:24:36.702495 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:36.702457 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:24:36.702676 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:36.702594 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5"
Apr 16 04:24:37.702562 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:37.702422 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:24:37.702863 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:37.702645 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929"
Apr 16 04:24:37.824700 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:37.824656 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" event={"ID":"1b42bbbf-91b9-4c9a-b525-a6832184768c","Type":"ContainerStarted","Data":"03078bebae3d093081cbf7b7bb3f791b4456fa94a558dfb710882be88b02ca1b"}
Apr 16 04:24:37.825175 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:37.825146 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:37.825175 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:37.825174 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:37.825310 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:37.825186 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:37.843391 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:37.842893 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:37.843391 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:37.843019 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp"
Apr 16 04:24:37.852798 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:37.852751 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" podStartSLOduration=8.54265254 podStartE2EDuration="25.85273557s" podCreationTimestamp="2026-04-16 04:24:12 +0000 UTC" firstStartedPulling="2026-04-16 04:24:15.444005087 +0000 UTC m=+3.346781070" lastFinishedPulling="2026-04-16 04:24:32.754088121 +0000 UTC m=+20.656864100" observedRunningTime="2026-04-16 04:24:37.852400885 +0000 UTC m=+25.755176882" watchObservedRunningTime="2026-04-16 04:24:37.85273557 +0000 UTC m=+25.755511569"
Apr 16 04:24:38.705137 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:38.705108 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:24:38.705823 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:38.705198 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5"
Apr 16 04:24:38.828270 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:38.828236 2575 generic.go:358] "Generic (PLEG): container finished" podID="56306845-377c-4423-8ae8-4f5824970788" containerID="0eb1b13ef971399f2ab4a0c20ea7ad21c7f20920cb3fb8ce3cf77b31a7753be4" exitCode=0
Apr 16 04:24:38.828422 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:38.828315 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cxwbn" event={"ID":"56306845-377c-4423-8ae8-4f5824970788","Type":"ContainerDied","Data":"0eb1b13ef971399f2ab4a0c20ea7ad21c7f20920cb3fb8ce3cf77b31a7753be4"}
Apr 16 04:24:39.681039 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:39.680855 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-875wc"]
Apr 16 04:24:39.681194 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:39.681141 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:24:39.681275 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:39.681248 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929"
Apr 16 04:24:39.683744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:39.683723 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jv9m2"]
Apr 16 04:24:39.683863 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:39.683817 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:24:39.683908 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:39.683885 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5"
Apr 16 04:24:39.832067 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:39.832029 2575 generic.go:358] "Generic (PLEG): container finished" podID="56306845-377c-4423-8ae8-4f5824970788" containerID="9a5a3f54ad4e659ccd94db13350e5d7340f2d38354066df131511984e099ba8b" exitCode=0
Apr 16 04:24:39.832421 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:39.832110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cxwbn" event={"ID":"56306845-377c-4423-8ae8-4f5824970788","Type":"ContainerDied","Data":"9a5a3f54ad4e659ccd94db13350e5d7340f2d38354066df131511984e099ba8b"}
Apr 16 04:24:40.836260 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:40.836228 2575 generic.go:358] "Generic (PLEG): container finished" podID="56306845-377c-4423-8ae8-4f5824970788" containerID="c2e969319f9fb067032d0e90b2caa997de6e9455c9982db0769b99ebcaa0878e" exitCode=0
Apr 16 04:24:40.836623 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:40.836285 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cxwbn" event={"ID":"56306845-377c-4423-8ae8-4f5824970788","Type":"ContainerDied","Data":"c2e969319f9fb067032d0e90b2caa997de6e9455c9982db0769b99ebcaa0878e"}
Apr 16 04:24:41.702535 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:41.702500 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:24:41.702691 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:41.702506 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:24:41.702691 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:41.702625 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5"
Apr 16 04:24:41.702795 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:41.702754 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929"
Apr 16 04:24:43.702660 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:43.702570 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:24:43.702660 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:43.702652 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:24:43.703115 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:43.702777 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-875wc" podUID="414aeb12-26d8-4994-8fe8-1cad0342c929"
Apr 16 04:24:43.703115 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:43.702916 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jv9m2" podUID="28a44ec7-295d-426e-86f6-f5b9bd257fb5"
Apr 16 04:24:45.398988 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.398957 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-103.ec2.internal" event="NodeReady"
Apr 16 04:24:45.399386 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.399141 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 04:24:45.438540 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.438502 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fvbjz"]
Apr 16 04:24:45.467439 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.467404 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8pzwz"]
Apr 16 04:24:45.467602 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.467571 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:24:45.470424 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.470398 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 04:24:45.470559 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.470463 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-smz8d\""
Apr 16 04:24:45.470559 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.470472 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 04:24:45.478986 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.478961 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fvbjz"]
Apr 16 04:24:45.478986 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.478985 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8pzwz"]
Apr 16 04:24:45.479167 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.479087 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8pzwz"
Apr 16 04:24:45.481687 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.481667 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 04:24:45.481798 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.481696 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 04:24:45.481991 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.481934 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 04:24:45.482130 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.482007 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-d4mbv\""
Apr 16 04:24:45.610394 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.610293 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz"
Apr 16 04:24:45.610394 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.610354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:24:45.610394 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.610382 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7hjz\" (UniqueName: \"kubernetes.io/projected/6a8f8c89-66e4-4302-bf2b-05fd369633cf-kube-api-access-t7hjz\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:24:45.610640 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.610503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgrpx\" (UniqueName: \"kubernetes.io/projected/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-kube-api-access-bgrpx\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz"
Apr 16 04:24:45.610640 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.610574 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a8f8c89-66e4-4302-bf2b-05fd369633cf-config-volume\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:24:45.610640 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.610594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6a8f8c89-66e4-4302-bf2b-05fd369633cf-tmp-dir\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:24:45.702779 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.702744 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:24:45.702779 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.702792 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:24:45.705969 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.705938 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l4rjf\""
Apr 16 04:24:45.705969 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.705964 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 04:24:45.706169 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.706123 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 04:24:45.706169 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.706134 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 04:24:45.706428 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.706411 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xfxbt\""
Apr 16 04:24:45.710892 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.710862 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgrpx\" (UniqueName: \"kubernetes.io/projected/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-kube-api-access-bgrpx\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz"
Apr 16 04:24:45.711015 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.710937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a8f8c89-66e4-4302-bf2b-05fd369633cf-config-volume\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:24:45.711015 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.710970 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6a8f8c89-66e4-4302-bf2b-05fd369633cf-tmp-dir\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:24:45.711015 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.710999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz"
Apr 16 04:24:45.711321 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.711026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:24:45.711321 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.711050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7hjz\" (UniqueName: \"kubernetes.io/projected/6a8f8c89-66e4-4302-bf2b-05fd369633cf-kube-api-access-t7hjz\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:24:45.711700 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.711676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6a8f8c89-66e4-4302-bf2b-05fd369633cf-tmp-dir\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:24:45.711839 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:45.711800 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 04:24:45.711916 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:45.711873 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert podName:b0ca4f43-c47b-4ec0-82f8-3a8054f7a277 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:46.211854346 +0000 UTC m=+34.114630330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert") pod "ingress-canary-8pzwz" (UID: "b0ca4f43-c47b-4ec0-82f8-3a8054f7a277") : secret "canary-serving-cert" not found
Apr 16 04:24:45.712053 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:45.712024 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 04:24:45.712160 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:45.712096 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls podName:6a8f8c89-66e4-4302-bf2b-05fd369633cf nodeName:}" failed. No retries permitted until 2026-04-16 04:24:46.212068597 +0000 UTC m=+34.114844574 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls") pod "dns-default-fvbjz" (UID: "6a8f8c89-66e4-4302-bf2b-05fd369633cf") : secret "dns-default-metrics-tls" not found Apr 16 04:24:45.712220 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.712198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a8f8c89-66e4-4302-bf2b-05fd369633cf-config-volume\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz" Apr 16 04:24:45.723254 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.723226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgrpx\" (UniqueName: \"kubernetes.io/projected/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-kube-api-access-bgrpx\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz" Apr 16 04:24:45.723403 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:45.723384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7hjz\" (UniqueName: \"kubernetes.io/projected/6a8f8c89-66e4-4302-bf2b-05fd369633cf-kube-api-access-t7hjz\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz" Apr 16 04:24:46.215268 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:46.215229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz" Apr 16 04:24:46.215268 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:46.215284 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz" Apr 16 04:24:46.215560 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:46.215388 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:46.215560 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:46.215406 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:46.215560 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:46.215469 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert podName:b0ca4f43-c47b-4ec0-82f8-3a8054f7a277 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:47.215445801 +0000 UTC m=+35.118221781 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert") pod "ingress-canary-8pzwz" (UID: "b0ca4f43-c47b-4ec0-82f8-3a8054f7a277") : secret "canary-serving-cert" not found Apr 16 04:24:46.215560 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:46.215487 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls podName:6a8f8c89-66e4-4302-bf2b-05fd369633cf nodeName:}" failed. No retries permitted until 2026-04-16 04:24:47.215480007 +0000 UTC m=+35.118255984 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls") pod "dns-default-fvbjz" (UID: "6a8f8c89-66e4-4302-bf2b-05fd369633cf") : secret "dns-default-metrics-tls" not found Apr 16 04:24:46.316358 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:46.316309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc" Apr 16 04:24:46.316556 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:46.316454 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 04:24:46.316556 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:46.316535 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs podName:414aeb12-26d8-4994-8fe8-1cad0342c929 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:18.316514402 +0000 UTC m=+66.219290396 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs") pod "network-metrics-daemon-875wc" (UID: "414aeb12-26d8-4994-8fe8-1cad0342c929") : secret "metrics-daemon-secret" not found Apr 16 04:24:46.518089 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:46.518060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt8pn\" (UniqueName: \"kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn\") pod \"network-check-target-jv9m2\" (UID: \"28a44ec7-295d-426e-86f6-f5b9bd257fb5\") " pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:46.520738 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:46.520715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt8pn\" (UniqueName: \"kubernetes.io/projected/28a44ec7-295d-426e-86f6-f5b9bd257fb5-kube-api-access-rt8pn\") pod \"network-check-target-jv9m2\" (UID: \"28a44ec7-295d-426e-86f6-f5b9bd257fb5\") " pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:46.622074 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:46.622039 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:46.965990 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:46.965961 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jv9m2"] Apr 16 04:24:46.969301 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:24:46.969274 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a44ec7_295d_426e_86f6_f5b9bd257fb5.slice/crio-e8a8df93155f8b8c6c096ece3e568125fbd93936235095bc8fdefeb65fbba42d WatchSource:0}: Error finding container e8a8df93155f8b8c6c096ece3e568125fbd93936235095bc8fdefeb65fbba42d: Status 404 returned error can't find the container with id e8a8df93155f8b8c6c096ece3e568125fbd93936235095bc8fdefeb65fbba42d Apr 16 04:24:47.224577 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:47.224547 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz" Apr 16 04:24:47.224577 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:47.224583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz" Apr 16 04:24:47.224817 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:47.224701 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:47.224817 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:47.224745 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 16 04:24:47.224817 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:47.224776 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert podName:b0ca4f43-c47b-4ec0-82f8-3a8054f7a277 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:49.224757934 +0000 UTC m=+37.127533912 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert") pod "ingress-canary-8pzwz" (UID: "b0ca4f43-c47b-4ec0-82f8-3a8054f7a277") : secret "canary-serving-cert" not found Apr 16 04:24:47.224817 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:47.224792 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls podName:6a8f8c89-66e4-4302-bf2b-05fd369633cf nodeName:}" failed. No retries permitted until 2026-04-16 04:24:49.224786454 +0000 UTC m=+37.127562430 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls") pod "dns-default-fvbjz" (UID: "6a8f8c89-66e4-4302-bf2b-05fd369633cf") : secret "dns-default-metrics-tls" not found Apr 16 04:24:47.854179 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:47.853963 2575 generic.go:358] "Generic (PLEG): container finished" podID="56306845-377c-4423-8ae8-4f5824970788" containerID="c41caa6d44e3254bb5199b947d10178ffd6dbc1c6e52cfe15acd25f8f7d18186" exitCode=0 Apr 16 04:24:47.855198 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:47.854004 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cxwbn" event={"ID":"56306845-377c-4423-8ae8-4f5824970788","Type":"ContainerDied","Data":"c41caa6d44e3254bb5199b947d10178ffd6dbc1c6e52cfe15acd25f8f7d18186"} Apr 16 04:24:47.855702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:47.855679 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jv9m2" event={"ID":"28a44ec7-295d-426e-86f6-f5b9bd257fb5","Type":"ContainerStarted","Data":"e8a8df93155f8b8c6c096ece3e568125fbd93936235095bc8fdefeb65fbba42d"} Apr 16 04:24:48.860882 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:48.860841 2575 generic.go:358] "Generic (PLEG): container finished" podID="56306845-377c-4423-8ae8-4f5824970788" containerID="f355424a323766b833d826f0c9d45688ee0f3f3c7f1d536b529f05a0f2056945" exitCode=0 Apr 16 04:24:48.861321 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:48.860893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cxwbn" event={"ID":"56306845-377c-4423-8ae8-4f5824970788","Type":"ContainerDied","Data":"f355424a323766b833d826f0c9d45688ee0f3f3c7f1d536b529f05a0f2056945"} Apr 16 04:24:49.239174 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:49.239132 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz" Apr 16 04:24:49.239345 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:49.239187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz" Apr 16 04:24:49.239345 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:49.239302 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:49.239345 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:49.239305 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:49.239485 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:49.239383 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls podName:6a8f8c89-66e4-4302-bf2b-05fd369633cf nodeName:}" failed. No retries permitted until 2026-04-16 04:24:53.239362144 +0000 UTC m=+41.142138126 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls") pod "dns-default-fvbjz" (UID: "6a8f8c89-66e4-4302-bf2b-05fd369633cf") : secret "dns-default-metrics-tls" not found Apr 16 04:24:49.239485 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:49.239402 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert podName:b0ca4f43-c47b-4ec0-82f8-3a8054f7a277 nodeName:}" failed. 
No retries permitted until 2026-04-16 04:24:53.239392789 +0000 UTC m=+41.142168769 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert") pod "ingress-canary-8pzwz" (UID: "b0ca4f43-c47b-4ec0-82f8-3a8054f7a277") : secret "canary-serving-cert" not found Apr 16 04:24:49.865794 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:49.865764 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cxwbn" event={"ID":"56306845-377c-4423-8ae8-4f5824970788","Type":"ContainerStarted","Data":"a6cbf8b16f24f3a5cb1d8d955fc266d7fd4f3561e16a7e33aeb62b8d6134f115"} Apr 16 04:24:49.888109 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:49.888066 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cxwbn" podStartSLOduration=6.546701484 podStartE2EDuration="37.888051152s" podCreationTimestamp="2026-04-16 04:24:12 +0000 UTC" firstStartedPulling="2026-04-16 04:24:15.470545609 +0000 UTC m=+3.373321586" lastFinishedPulling="2026-04-16 04:24:46.811895274 +0000 UTC m=+34.714671254" observedRunningTime="2026-04-16 04:24:49.886434583 +0000 UTC m=+37.789210573" watchObservedRunningTime="2026-04-16 04:24:49.888051152 +0000 UTC m=+37.790827150" Apr 16 04:24:50.869498 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:50.869465 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jv9m2" event={"ID":"28a44ec7-295d-426e-86f6-f5b9bd257fb5","Type":"ContainerStarted","Data":"79d93c1b2317a5c37d372adc1ec2fb5fe9f6089f8fc302e6dda6009645c8ac0c"} Apr 16 04:24:50.869942 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:50.869773 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jv9m2" Apr 16 04:24:50.883613 ip-10-0-133-103 kubenswrapper[2575]: I0416 
04:24:50.883566 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jv9m2" podStartSLOduration=35.897775568 podStartE2EDuration="38.883551175s" podCreationTimestamp="2026-04-16 04:24:12 +0000 UTC" firstStartedPulling="2026-04-16 04:24:46.971310721 +0000 UTC m=+34.874086698" lastFinishedPulling="2026-04-16 04:24:49.957086324 +0000 UTC m=+37.859862305" observedRunningTime="2026-04-16 04:24:50.88345684 +0000 UTC m=+38.786232839" watchObservedRunningTime="2026-04-16 04:24:50.883551175 +0000 UTC m=+38.786327174" Apr 16 04:24:53.266180 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:53.266135 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz" Apr 16 04:24:53.266180 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:24:53.266183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz" Apr 16 04:24:53.266666 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:53.266282 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:53.266666 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:53.266360 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert podName:b0ca4f43-c47b-4ec0-82f8-3a8054f7a277 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:01.266343258 +0000 UTC m=+49.169119249 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert") pod "ingress-canary-8pzwz" (UID: "b0ca4f43-c47b-4ec0-82f8-3a8054f7a277") : secret "canary-serving-cert" not found Apr 16 04:24:53.266666 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:53.266292 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:53.266666 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:24:53.266467 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls podName:6a8f8c89-66e4-4302-bf2b-05fd369633cf nodeName:}" failed. No retries permitted until 2026-04-16 04:25:01.266444752 +0000 UTC m=+49.169220742 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls") pod "dns-default-fvbjz" (UID: "6a8f8c89-66e4-4302-bf2b-05fd369633cf") : secret "dns-default-metrics-tls" not found Apr 16 04:25:01.326561 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:01.326520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz" Apr 16 04:25:01.326561 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:01.326563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz" Apr 16 04:25:01.327492 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:01.326655 2575 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:25:01.327492 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:01.326679 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:25:01.327492 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:01.326710 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls podName:6a8f8c89-66e4-4302-bf2b-05fd369633cf nodeName:}" failed. No retries permitted until 2026-04-16 04:25:17.326696789 +0000 UTC m=+65.229472765 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls") pod "dns-default-fvbjz" (UID: "6a8f8c89-66e4-4302-bf2b-05fd369633cf") : secret "dns-default-metrics-tls" not found Apr 16 04:25:01.327492 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:01.326746 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert podName:b0ca4f43-c47b-4ec0-82f8-3a8054f7a277 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:17.32672792 +0000 UTC m=+65.229503901 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert") pod "ingress-canary-8pzwz" (UID: "b0ca4f43-c47b-4ec0-82f8-3a8054f7a277") : secret "canary-serving-cert" not found Apr 16 04:25:06.159668 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.159634 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz"] Apr 16 04:25:06.163156 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.163132 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" Apr 16 04:25:06.165804 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.165780 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 04:25:06.165906 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.165878 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 04:25:06.167036 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.167019 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 04:25:06.167127 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.167022 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 04:25:06.169313 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.169291 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz"] Apr 16 04:25:06.261091 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.261051 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/69ee622b-b591-422c-8c57-21765810903c-klusterlet-config\") pod \"klusterlet-addon-workmgr-6f467d586f-nzrjz\" (UID: \"69ee622b-b591-422c-8c57-21765810903c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" Apr 16 04:25:06.261091 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.261090 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c264g\" (UniqueName: 
\"kubernetes.io/projected/69ee622b-b591-422c-8c57-21765810903c-kube-api-access-c264g\") pod \"klusterlet-addon-workmgr-6f467d586f-nzrjz\" (UID: \"69ee622b-b591-422c-8c57-21765810903c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" Apr 16 04:25:06.261312 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.261124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69ee622b-b591-422c-8c57-21765810903c-tmp\") pod \"klusterlet-addon-workmgr-6f467d586f-nzrjz\" (UID: \"69ee622b-b591-422c-8c57-21765810903c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" Apr 16 04:25:06.362414 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.362382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69ee622b-b591-422c-8c57-21765810903c-tmp\") pod \"klusterlet-addon-workmgr-6f467d586f-nzrjz\" (UID: \"69ee622b-b591-422c-8c57-21765810903c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" Apr 16 04:25:06.362601 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.362448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/69ee622b-b591-422c-8c57-21765810903c-klusterlet-config\") pod \"klusterlet-addon-workmgr-6f467d586f-nzrjz\" (UID: \"69ee622b-b591-422c-8c57-21765810903c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" Apr 16 04:25:06.362601 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.362470 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c264g\" (UniqueName: \"kubernetes.io/projected/69ee622b-b591-422c-8c57-21765810903c-kube-api-access-c264g\") pod \"klusterlet-addon-workmgr-6f467d586f-nzrjz\" (UID: 
\"69ee622b-b591-422c-8c57-21765810903c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" Apr 16 04:25:06.362823 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.362799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69ee622b-b591-422c-8c57-21765810903c-tmp\") pod \"klusterlet-addon-workmgr-6f467d586f-nzrjz\" (UID: \"69ee622b-b591-422c-8c57-21765810903c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" Apr 16 04:25:06.366766 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.366746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/69ee622b-b591-422c-8c57-21765810903c-klusterlet-config\") pod \"klusterlet-addon-workmgr-6f467d586f-nzrjz\" (UID: \"69ee622b-b591-422c-8c57-21765810903c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" Apr 16 04:25:06.370031 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.370003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c264g\" (UniqueName: \"kubernetes.io/projected/69ee622b-b591-422c-8c57-21765810903c-kube-api-access-c264g\") pod \"klusterlet-addon-workmgr-6f467d586f-nzrjz\" (UID: \"69ee622b-b591-422c-8c57-21765810903c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" Apr 16 04:25:06.473490 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.473400 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" Apr 16 04:25:06.584104 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.584069 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz"] Apr 16 04:25:06.588291 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:25:06.588254 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69ee622b_b591_422c_8c57_21765810903c.slice/crio-ce612264dfbbf12903205e34e843862e1647d32acb48bc57ac97b28bb5ac10d2 WatchSource:0}: Error finding container ce612264dfbbf12903205e34e843862e1647d32acb48bc57ac97b28bb5ac10d2: Status 404 returned error can't find the container with id ce612264dfbbf12903205e34e843862e1647d32acb48bc57ac97b28bb5ac10d2 Apr 16 04:25:06.900210 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:06.900170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" event={"ID":"69ee622b-b591-422c-8c57-21765810903c","Type":"ContainerStarted","Data":"ce612264dfbbf12903205e34e843862e1647d32acb48bc57ac97b28bb5ac10d2"} Apr 16 04:25:09.851894 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:09.851861 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4bdrp" Apr 16 04:25:10.909304 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:10.909216 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" event={"ID":"69ee622b-b591-422c-8c57-21765810903c","Type":"ContainerStarted","Data":"28d96b75ce75dc14f029f0ceed58ef6fb1454e5e5df34f0b620fdbe3155980bd"} Apr 16 04:25:10.909683 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:10.909426 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz"
Apr 16 04:25:10.911085 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:10.911064 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz"
Apr 16 04:25:10.923563 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:10.923516 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f467d586f-nzrjz" podStartSLOduration=0.866375341 podStartE2EDuration="4.92350238s" podCreationTimestamp="2026-04-16 04:25:06 +0000 UTC" firstStartedPulling="2026-04-16 04:25:06.590277274 +0000 UTC m=+54.493053253" lastFinishedPulling="2026-04-16 04:25:10.647404301 +0000 UTC m=+58.550180292" observedRunningTime="2026-04-16 04:25:10.923196083 +0000 UTC m=+58.825972095" watchObservedRunningTime="2026-04-16 04:25:10.92350238 +0000 UTC m=+58.826278378"
Apr 16 04:25:17.339348 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:17.339289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz"
Apr 16 04:25:17.339348 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:17.339349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:25:17.339981 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:17.339473 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 04:25:17.339981 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:17.339557 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert podName:b0ca4f43-c47b-4ec0-82f8-3a8054f7a277 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:49.33953415 +0000 UTC m=+97.242310144 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert") pod "ingress-canary-8pzwz" (UID: "b0ca4f43-c47b-4ec0-82f8-3a8054f7a277") : secret "canary-serving-cert" not found
Apr 16 04:25:17.339981 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:17.339479 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 04:25:17.340141 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:17.339986 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls podName:6a8f8c89-66e4-4302-bf2b-05fd369633cf nodeName:}" failed. No retries permitted until 2026-04-16 04:25:49.339953831 +0000 UTC m=+97.242729825 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls") pod "dns-default-fvbjz" (UID: "6a8f8c89-66e4-4302-bf2b-05fd369633cf") : secret "dns-default-metrics-tls" not found
Apr 16 04:25:18.347393 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:18.347344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:25:18.347763 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:18.347446 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 04:25:18.347763 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:18.347516 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs podName:414aeb12-26d8-4994-8fe8-1cad0342c929 nodeName:}" failed. No retries permitted until 2026-04-16 04:26:22.347499405 +0000 UTC m=+130.250275381 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs") pod "network-metrics-daemon-875wc" (UID: "414aeb12-26d8-4994-8fe8-1cad0342c929") : secret "metrics-daemon-secret" not found
Apr 16 04:25:22.875297 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:22.875271 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jv9m2"
Apr 16 04:25:33.740951 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.740827 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"]
Apr 16 04:25:33.744775 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.744757 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"
Apr 16 04:25:33.747774 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.747754 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-vthcz\""
Apr 16 04:25:33.747954 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.747940 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 04:25:33.748119 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.748106 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 04:25:33.749092 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.749080 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 04:25:33.749251 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.749240 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 04:25:33.753516 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.753493 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"]
Apr 16 04:25:33.848935 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.848904 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-76684bdc88-r8jsk"]
Apr 16 04:25:33.851736 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.851721 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:33.854969 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.854947 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 04:25:33.855081 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.854969 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 04:25:33.856577 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.856554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 04:25:33.856804 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.856784 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hvkh7\""
Apr 16 04:25:33.859146 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.859122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5kt8\" (UniqueName: \"kubernetes.io/projected/b6856769-4d20-4817-9f00-63ded00b2073-kube-api-access-b5kt8\") pod \"kube-storage-version-migrator-operator-756bb7d76f-79vz7\" (UID: \"b6856769-4d20-4817-9f00-63ded00b2073\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"
Apr 16 04:25:33.859231 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.859203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6856769-4d20-4817-9f00-63ded00b2073-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-79vz7\" (UID: \"b6856769-4d20-4817-9f00-63ded00b2073\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"
Apr 16 04:25:33.859285 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.859270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6856769-4d20-4817-9f00-63ded00b2073-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-79vz7\" (UID: \"b6856769-4d20-4817-9f00-63ded00b2073\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"
Apr 16 04:25:33.861042 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.861027 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 04:25:33.865586 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.865563 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76684bdc88-r8jsk"]
Apr 16 04:25:33.950081 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.949983 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5d4bf78676-l9b55"]
Apr 16 04:25:33.953402 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.953378 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-kn57s"]
Apr 16 04:25:33.953551 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.953530 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:33.956061 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.956031 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:25:33.956202 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.956182 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kzm5k\""
Apr 16 04:25:33.956320 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.956215 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 04:25:33.956669 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.956640 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 04:25:33.956870 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.956648 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 04:25:33.956870 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.956698 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 04:25:33.957017 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.956751 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 04:25:33.957017 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.956754 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 04:25:33.958648 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.958627 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 04:25:33.958754 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.958632 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-l996d\""
Apr 16 04:25:33.958858 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.958812 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 04:25:33.958858 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.958818 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 04:25:33.959180 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.959162 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 04:25:33.959571 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.959554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6856769-4d20-4817-9f00-63ded00b2073-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-79vz7\" (UID: \"b6856769-4d20-4817-9f00-63ded00b2073\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"
Apr 16 04:25:33.959640 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.959591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-bound-sa-token\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:33.959640 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.959608 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snjv2\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-kube-api-access-snjv2\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:33.959640 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.959634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d8b1eea-523b-4786-aa84-e8f73ea58da2-image-registry-private-configuration\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:33.959855 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.959658 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d8b1eea-523b-4786-aa84-e8f73ea58da2-ca-trust-extracted\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:33.959855 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.959717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6856769-4d20-4817-9f00-63ded00b2073-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-79vz7\" (UID: \"b6856769-4d20-4817-9f00-63ded00b2073\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"
Apr 16 04:25:33.959855 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.959757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:33.959855 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.959786 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d8b1eea-523b-4786-aa84-e8f73ea58da2-trusted-ca\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:33.959855 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.959811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d8b1eea-523b-4786-aa84-e8f73ea58da2-installation-pull-secrets\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:33.960179 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.959906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5kt8\" (UniqueName: \"kubernetes.io/projected/b6856769-4d20-4817-9f00-63ded00b2073-kube-api-access-b5kt8\") pod \"kube-storage-version-migrator-operator-756bb7d76f-79vz7\" (UID: \"b6856769-4d20-4817-9f00-63ded00b2073\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"
Apr 16 04:25:33.960179 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.959939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-certificates\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:33.960352 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.960278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6856769-4d20-4817-9f00-63ded00b2073-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-79vz7\" (UID: \"b6856769-4d20-4817-9f00-63ded00b2073\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"
Apr 16 04:25:33.961727 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.961708 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-kn57s"]
Apr 16 04:25:33.962300 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.962277 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6856769-4d20-4817-9f00-63ded00b2073-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-79vz7\" (UID: \"b6856769-4d20-4817-9f00-63ded00b2073\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"
Apr 16 04:25:33.964398 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.964373 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5d4bf78676-l9b55"]
Apr 16 04:25:33.965437 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.965417 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 04:25:33.976304 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:33.976282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5kt8\" (UniqueName: \"kubernetes.io/projected/b6856769-4d20-4817-9f00-63ded00b2073-kube-api-access-b5kt8\") pod \"kube-storage-version-migrator-operator-756bb7d76f-79vz7\" (UID: \"b6856769-4d20-4817-9f00-63ded00b2073\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"
Apr 16 04:25:34.053625 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.053569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"
Apr 16 04:25:34.060490 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:34.060576 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed401949-3fdf-455d-bb0a-400a7a15466c-trusted-ca\") pod \"console-operator-d87b8d5fc-kn57s\" (UID: \"ed401949-3fdf-455d-bb0a-400a7a15466c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:25:34.060576 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hpgb\" (UniqueName: \"kubernetes.io/projected/ed401949-3fdf-455d-bb0a-400a7a15466c-kube-api-access-7hpgb\") pod \"console-operator-d87b8d5fc-kn57s\" (UID: \"ed401949-3fdf-455d-bb0a-400a7a15466c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:25:34.060576 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-bound-sa-token\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.060686 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snjv2\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-kube-api-access-snjv2\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.060686 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060639 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed401949-3fdf-455d-bb0a-400a7a15466c-config\") pod \"console-operator-d87b8d5fc-kn57s\" (UID: \"ed401949-3fdf-455d-bb0a-400a7a15466c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:25:34.060686 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d8b1eea-523b-4786-aa84-e8f73ea58da2-image-registry-private-configuration\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.060822 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d8b1eea-523b-4786-aa84-e8f73ea58da2-ca-trust-extracted\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.060822 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060726 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed401949-3fdf-455d-bb0a-400a7a15466c-serving-cert\") pod \"console-operator-d87b8d5fc-kn57s\" (UID: \"ed401949-3fdf-455d-bb0a-400a7a15466c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:25:34.060822 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060760 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.060822 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d8b1eea-523b-4786-aa84-e8f73ea58da2-trusted-ca\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.060822 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d8b1eea-523b-4786-aa84-e8f73ea58da2-installation-pull-secrets\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.061048 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-certificates\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.061048 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-default-certificate\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:34.061048 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:34.060911 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 04:25:34.061048 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:34.060939 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76684bdc88-r8jsk: secret "image-registry-tls" not found
Apr 16 04:25:34.061048 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.060962 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-stats-auth\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:34.061048 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:34.061005 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls podName:3d8b1eea-523b-4786-aa84-e8f73ea58da2 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:34.560986642 +0000 UTC m=+82.463762638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls") pod "image-registry-76684bdc88-r8jsk" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2") : secret "image-registry-tls" not found
Apr 16 04:25:34.061489 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.061097 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d8b1eea-523b-4786-aa84-e8f73ea58da2-ca-trust-extracted\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.061591 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.061561 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:34.061663 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.061604 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-certificates\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.061663 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.061614 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgkbk\" (UniqueName: \"kubernetes.io/projected/a33cde98-e7f0-4bc9-a9e5-1a6136039882-kube-api-access-lgkbk\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:34.062524 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.062481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d8b1eea-523b-4786-aa84-e8f73ea58da2-trusted-ca\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.063595 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.063560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d8b1eea-523b-4786-aa84-e8f73ea58da2-installation-pull-secrets\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.063703 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.063646 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d8b1eea-523b-4786-aa84-e8f73ea58da2-image-registry-private-configuration\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.069309 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.069285 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-bound-sa-token\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.069458 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.069443 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snjv2\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-kube-api-access-snjv2\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:34.162678 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.162646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed401949-3fdf-455d-bb0a-400a7a15466c-config\") pod \"console-operator-d87b8d5fc-kn57s\" (UID: \"ed401949-3fdf-455d-bb0a-400a7a15466c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:25:34.162678 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.162687 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed401949-3fdf-455d-bb0a-400a7a15466c-serving-cert\") pod \"console-operator-d87b8d5fc-kn57s\" (UID: \"ed401949-3fdf-455d-bb0a-400a7a15466c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:25:34.162866 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.162773 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-default-certificate\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:34.162866 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.162800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-stats-auth\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:34.162866 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.162827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:34.162866 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.162850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgkbk\" (UniqueName: \"kubernetes.io/projected/a33cde98-e7f0-4bc9-a9e5-1a6136039882-kube-api-access-lgkbk\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:34.163039 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.162881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:34.163039 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.162910 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed401949-3fdf-455d-bb0a-400a7a15466c-trusted-ca\") pod \"console-operator-d87b8d5fc-kn57s\" (UID: \"ed401949-3fdf-455d-bb0a-400a7a15466c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:25:34.163039 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.162935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hpgb\" (UniqueName: \"kubernetes.io/projected/ed401949-3fdf-455d-bb0a-400a7a15466c-kube-api-access-7hpgb\") pod \"console-operator-d87b8d5fc-kn57s\" (UID: \"ed401949-3fdf-455d-bb0a-400a7a15466c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:25:34.163039 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:34.162990 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle podName:a33cde98-e7f0-4bc9-a9e5-1a6136039882 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:34.662964412 +0000 UTC m=+82.565740409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle") pod "router-default-5d4bf78676-l9b55" (UID: "a33cde98-e7f0-4bc9-a9e5-1a6136039882") : configmap references non-existent config key: service-ca.crt
Apr 16 04:25:34.163226 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:34.163160 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 04:25:34.163278 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:34.163225 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs podName:a33cde98-e7f0-4bc9-a9e5-1a6136039882 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:34.663204231 +0000 UTC m=+82.565980224 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs") pod "router-default-5d4bf78676-l9b55" (UID: "a33cde98-e7f0-4bc9-a9e5-1a6136039882") : secret "router-metrics-certs-default" not found Apr 16 04:25:34.163525 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.163500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed401949-3fdf-455d-bb0a-400a7a15466c-config\") pod \"console-operator-d87b8d5fc-kn57s\" (UID: \"ed401949-3fdf-455d-bb0a-400a7a15466c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" Apr 16 04:25:34.164256 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.164237 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed401949-3fdf-455d-bb0a-400a7a15466c-trusted-ca\") pod \"console-operator-d87b8d5fc-kn57s\" (UID: \"ed401949-3fdf-455d-bb0a-400a7a15466c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" Apr 16 04:25:34.165270 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.165252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-stats-auth\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:25:34.165694 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.165670 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed401949-3fdf-455d-bb0a-400a7a15466c-serving-cert\") pod \"console-operator-d87b8d5fc-kn57s\" (UID: \"ed401949-3fdf-455d-bb0a-400a7a15466c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" Apr 16 04:25:34.165768 ip-10-0-133-103 kubenswrapper[2575]: I0416 
04:25:34.165754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-default-certificate\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:25:34.168729 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.168705 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7"] Apr 16 04:25:34.172407 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.172384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgkbk\" (UniqueName: \"kubernetes.io/projected/a33cde98-e7f0-4bc9-a9e5-1a6136039882-kube-api-access-lgkbk\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:25:34.172669 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:25:34.172650 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6856769_4d20_4817_9f00_63ded00b2073.slice/crio-8828b2b6f92f5a47317d11d877e9a937954913524e383718e151b5b8f9c857d9 WatchSource:0}: Error finding container 8828b2b6f92f5a47317d11d877e9a937954913524e383718e151b5b8f9c857d9: Status 404 returned error can't find the container with id 8828b2b6f92f5a47317d11d877e9a937954913524e383718e151b5b8f9c857d9 Apr 16 04:25:34.172751 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.172736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hpgb\" (UniqueName: \"kubernetes.io/projected/ed401949-3fdf-455d-bb0a-400a7a15466c-kube-api-access-7hpgb\") pod \"console-operator-d87b8d5fc-kn57s\" (UID: \"ed401949-3fdf-455d-bb0a-400a7a15466c\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" Apr 16 04:25:34.275603 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.275571 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" Apr 16 04:25:34.385620 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.385590 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-kn57s"] Apr 16 04:25:34.389101 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:25:34.389075 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded401949_3fdf_455d_bb0a_400a7a15466c.slice/crio-1060dd59c0f1c3d43649dab292d746c77d551b43515ada88295bbb5ca10a584d WatchSource:0}: Error finding container 1060dd59c0f1c3d43649dab292d746c77d551b43515ada88295bbb5ca10a584d: Status 404 returned error can't find the container with id 1060dd59c0f1c3d43649dab292d746c77d551b43515ada88295bbb5ca10a584d Apr 16 04:25:34.567129 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.567097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk" Apr 16 04:25:34.567291 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:34.567252 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 04:25:34.567291 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:34.567270 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76684bdc88-r8jsk: secret "image-registry-tls" not found Apr 16 04:25:34.567377 ip-10-0-133-103 
kubenswrapper[2575]: E0416 04:25:34.567352 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls podName:3d8b1eea-523b-4786-aa84-e8f73ea58da2 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:35.567318859 +0000 UTC m=+83.470094854 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls") pod "image-registry-76684bdc88-r8jsk" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2") : secret "image-registry-tls" not found Apr 16 04:25:34.668060 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.667947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:25:34.668245 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.668096 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:25:34.668245 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:34.668100 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 04:25:34.668245 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:34.668160 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs podName:a33cde98-e7f0-4bc9-a9e5-1a6136039882 nodeName:}" failed. 
No retries permitted until 2026-04-16 04:25:35.668143135 +0000 UTC m=+83.570919112 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs") pod "router-default-5d4bf78676-l9b55" (UID: "a33cde98-e7f0-4bc9-a9e5-1a6136039882") : secret "router-metrics-certs-default" not found Apr 16 04:25:34.668245 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:34.668208 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle podName:a33cde98-e7f0-4bc9-a9e5-1a6136039882 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:35.668190919 +0000 UTC m=+83.570966897 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle") pod "router-default-5d4bf78676-l9b55" (UID: "a33cde98-e7f0-4bc9-a9e5-1a6136039882") : configmap references non-existent config key: service-ca.crt Apr 16 04:25:34.955249 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.955146 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" event={"ID":"ed401949-3fdf-455d-bb0a-400a7a15466c","Type":"ContainerStarted","Data":"1060dd59c0f1c3d43649dab292d746c77d551b43515ada88295bbb5ca10a584d"} Apr 16 04:25:34.956192 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:34.956155 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7" event={"ID":"b6856769-4d20-4817-9f00-63ded00b2073","Type":"ContainerStarted","Data":"8828b2b6f92f5a47317d11d877e9a937954913524e383718e151b5b8f9c857d9"} Apr 16 04:25:35.576109 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:35.576063 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk" Apr 16 04:25:35.576286 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:35.576181 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 04:25:35.576286 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:35.576202 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76684bdc88-r8jsk: secret "image-registry-tls" not found Apr 16 04:25:35.576286 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:35.576260 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls podName:3d8b1eea-523b-4786-aa84-e8f73ea58da2 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:37.576243303 +0000 UTC m=+85.479019279 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls") pod "image-registry-76684bdc88-r8jsk" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2") : secret "image-registry-tls" not found Apr 16 04:25:35.676784 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:35.676754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:25:35.676966 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:35.676795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:25:35.676966 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:35.676924 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle podName:a33cde98-e7f0-4bc9-a9e5-1a6136039882 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:37.676907201 +0000 UTC m=+85.579683197 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle") pod "router-default-5d4bf78676-l9b55" (UID: "a33cde98-e7f0-4bc9-a9e5-1a6136039882") : configmap references non-existent config key: service-ca.crt Apr 16 04:25:35.677097 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:35.676988 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 04:25:35.677097 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:35.677049 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs podName:a33cde98-e7f0-4bc9-a9e5-1a6136039882 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:37.677032819 +0000 UTC m=+85.579808798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs") pod "router-default-5d4bf78676-l9b55" (UID: "a33cde98-e7f0-4bc9-a9e5-1a6136039882") : secret "router-metrics-certs-default" not found Apr 16 04:25:37.590972 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:37.590931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk" Apr 16 04:25:37.591384 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:37.591091 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 04:25:37.591384 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:37.591112 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-76684bdc88-r8jsk: secret "image-registry-tls" not found Apr 16 04:25:37.591384 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:37.591168 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls podName:3d8b1eea-523b-4786-aa84-e8f73ea58da2 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:41.591152624 +0000 UTC m=+89.493928601 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls") pod "image-registry-76684bdc88-r8jsk" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2") : secret "image-registry-tls" not found Apr 16 04:25:37.692258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:37.692205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:25:37.692258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:37.692262 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:25:37.692503 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:37.692390 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle podName:a33cde98-e7f0-4bc9-a9e5-1a6136039882 nodeName:}" failed. 
No retries permitted until 2026-04-16 04:25:41.6923731 +0000 UTC m=+89.595149076 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle") pod "router-default-5d4bf78676-l9b55" (UID: "a33cde98-e7f0-4bc9-a9e5-1a6136039882") : configmap references non-existent config key: service-ca.crt Apr 16 04:25:37.692503 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:37.692396 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 04:25:37.692503 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:37.692430 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs podName:a33cde98-e7f0-4bc9-a9e5-1a6136039882 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:41.692422388 +0000 UTC m=+89.595198365 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs") pod "router-default-5d4bf78676-l9b55" (UID: "a33cde98-e7f0-4bc9-a9e5-1a6136039882") : secret "router-metrics-certs-default" not found Apr 16 04:25:37.963779 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:37.963704 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/0.log" Apr 16 04:25:37.963779 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:37.963742 2575 generic.go:358] "Generic (PLEG): container finished" podID="ed401949-3fdf-455d-bb0a-400a7a15466c" containerID="614aec8dfd84b206070d13408c308cede4f10e5da31e583efec65e3ea09cc138" exitCode=255 Apr 16 04:25:37.964002 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:37.963777 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" event={"ID":"ed401949-3fdf-455d-bb0a-400a7a15466c","Type":"ContainerDied","Data":"614aec8dfd84b206070d13408c308cede4f10e5da31e583efec65e3ea09cc138"} Apr 16 04:25:37.964109 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:37.964087 2575 scope.go:117] "RemoveContainer" containerID="614aec8dfd84b206070d13408c308cede4f10e5da31e583efec65e3ea09cc138" Apr 16 04:25:37.965260 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:37.965239 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7" event={"ID":"b6856769-4d20-4817-9f00-63ded00b2073","Type":"ContainerStarted","Data":"c7fbd8c27966bbde3275dbe01b1463c35f5314a016ad4b42ad87bacd52c1ec46"} Apr 16 04:25:37.994700 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:37.994291 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7" podStartSLOduration=2.25909081 podStartE2EDuration="4.994274963s" podCreationTimestamp="2026-04-16 04:25:33 +0000 UTC" firstStartedPulling="2026-04-16 04:25:34.174427153 +0000 UTC m=+82.077203131" lastFinishedPulling="2026-04-16 04:25:36.909611303 +0000 UTC m=+84.812387284" observedRunningTime="2026-04-16 04:25:37.994217958 +0000 UTC m=+85.896993956" watchObservedRunningTime="2026-04-16 04:25:37.994274963 +0000 UTC m=+85.897050963" Apr 16 04:25:38.968780 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:38.968754 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/1.log" Apr 16 04:25:38.969244 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:38.969105 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/0.log" Apr 16 04:25:38.969244 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:38.969136 2575 generic.go:358] "Generic (PLEG): container finished" podID="ed401949-3fdf-455d-bb0a-400a7a15466c" containerID="e362dfe26e3c54c99b6a88b59be18f979b6dc19ff954e4a76ead80aa75d7f4e6" exitCode=255 Apr 16 04:25:38.969367 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:38.969240 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" event={"ID":"ed401949-3fdf-455d-bb0a-400a7a15466c","Type":"ContainerDied","Data":"e362dfe26e3c54c99b6a88b59be18f979b6dc19ff954e4a76ead80aa75d7f4e6"} Apr 16 04:25:38.969367 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:38.969287 2575 scope.go:117] "RemoveContainer" containerID="614aec8dfd84b206070d13408c308cede4f10e5da31e583efec65e3ea09cc138" Apr 16 04:25:38.969475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:38.969458 2575 
scope.go:117] "RemoveContainer" containerID="e362dfe26e3c54c99b6a88b59be18f979b6dc19ff954e4a76ead80aa75d7f4e6" Apr 16 04:25:38.969652 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:38.969634 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kn57s_openshift-console-operator(ed401949-3fdf-455d-bb0a-400a7a15466c)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" podUID="ed401949-3fdf-455d-bb0a-400a7a15466c" Apr 16 04:25:39.973028 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:39.973002 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/1.log" Apr 16 04:25:39.973506 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:39.973423 2575 scope.go:117] "RemoveContainer" containerID="e362dfe26e3c54c99b6a88b59be18f979b6dc19ff954e4a76ead80aa75d7f4e6" Apr 16 04:25:39.973654 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:39.973634 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kn57s_openshift-console-operator(ed401949-3fdf-455d-bb0a-400a7a15466c)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" podUID="ed401949-3fdf-455d-bb0a-400a7a15466c" Apr 16 04:25:40.313168 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.313138 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-kgkxr"] Apr 16 04:25:40.317245 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.317230 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr" Apr 16 04:25:40.319776 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.319759 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bnjgs\"" Apr 16 04:25:40.320057 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.320035 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 04:25:40.320129 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.320112 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 04:25:40.320180 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.320149 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 04:25:40.320219 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.320178 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 04:25:40.322853 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.322833 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-kgkxr"] Apr 16 04:25:40.414644 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.414611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2b2d1087-2ad0-4620-96de-619f77fb1045-signing-key\") pod \"service-ca-bfc587fb7-kgkxr\" (UID: \"2b2d1087-2ad0-4620-96de-619f77fb1045\") " pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr" Apr 16 04:25:40.414806 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.414664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krtjl\" (UniqueName: 
\"kubernetes.io/projected/2b2d1087-2ad0-4620-96de-619f77fb1045-kube-api-access-krtjl\") pod \"service-ca-bfc587fb7-kgkxr\" (UID: \"2b2d1087-2ad0-4620-96de-619f77fb1045\") " pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr"
Apr 16 04:25:40.414806 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.414787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2b2d1087-2ad0-4620-96de-619f77fb1045-signing-cabundle\") pod \"service-ca-bfc587fb7-kgkxr\" (UID: \"2b2d1087-2ad0-4620-96de-619f77fb1045\") " pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr"
Apr 16 04:25:40.515214 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.515183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2b2d1087-2ad0-4620-96de-619f77fb1045-signing-cabundle\") pod \"service-ca-bfc587fb7-kgkxr\" (UID: \"2b2d1087-2ad0-4620-96de-619f77fb1045\") " pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr"
Apr 16 04:25:40.515300 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.515252 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2b2d1087-2ad0-4620-96de-619f77fb1045-signing-key\") pod \"service-ca-bfc587fb7-kgkxr\" (UID: \"2b2d1087-2ad0-4620-96de-619f77fb1045\") " pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr"
Apr 16 04:25:40.515300 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.515284 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krtjl\" (UniqueName: \"kubernetes.io/projected/2b2d1087-2ad0-4620-96de-619f77fb1045-kube-api-access-krtjl\") pod \"service-ca-bfc587fb7-kgkxr\" (UID: \"2b2d1087-2ad0-4620-96de-619f77fb1045\") " pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr"
Apr 16 04:25:40.515862 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.515827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2b2d1087-2ad0-4620-96de-619f77fb1045-signing-cabundle\") pod \"service-ca-bfc587fb7-kgkxr\" (UID: \"2b2d1087-2ad0-4620-96de-619f77fb1045\") " pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr"
Apr 16 04:25:40.517656 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.517635 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2b2d1087-2ad0-4620-96de-619f77fb1045-signing-key\") pod \"service-ca-bfc587fb7-kgkxr\" (UID: \"2b2d1087-2ad0-4620-96de-619f77fb1045\") " pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr"
Apr 16 04:25:40.522736 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.522710 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krtjl\" (UniqueName: \"kubernetes.io/projected/2b2d1087-2ad0-4620-96de-619f77fb1045-kube-api-access-krtjl\") pod \"service-ca-bfc587fb7-kgkxr\" (UID: \"2b2d1087-2ad0-4620-96de-619f77fb1045\") " pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr"
Apr 16 04:25:40.626748 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.626661 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr"
Apr 16 04:25:40.738293 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.738262 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-kgkxr"]
Apr 16 04:25:40.741238 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:25:40.741206 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b2d1087_2ad0_4620_96de_619f77fb1045.slice/crio-16d57c7c1417c58c84c121f2ad553d612276b37e4636849a91abda321caeae49 WatchSource:0}: Error finding container 16d57c7c1417c58c84c121f2ad553d612276b37e4636849a91abda321caeae49: Status 404 returned error can't find the container with id 16d57c7c1417c58c84c121f2ad553d612276b37e4636849a91abda321caeae49
Apr 16 04:25:40.976256 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:40.976162 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr" event={"ID":"2b2d1087-2ad0-4620-96de-619f77fb1045","Type":"ContainerStarted","Data":"16d57c7c1417c58c84c121f2ad553d612276b37e4636849a91abda321caeae49"}
Apr 16 04:25:41.624521 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:41.624479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:41.624696 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:41.624657 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 04:25:41.624696 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:41.624672 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76684bdc88-r8jsk: secret "image-registry-tls" not found
Apr 16 04:25:41.624768 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:41.624738 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls podName:3d8b1eea-523b-4786-aa84-e8f73ea58da2 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:49.624716707 +0000 UTC m=+97.527492686 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls") pod "image-registry-76684bdc88-r8jsk" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2") : secret "image-registry-tls" not found
Apr 16 04:25:41.725289 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:41.725248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:41.725489 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:41.725373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:41.725489 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:41.725457 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle podName:a33cde98-e7f0-4bc9-a9e5-1a6136039882 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:49.725432963 +0000 UTC m=+97.628208951 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle") pod "router-default-5d4bf78676-l9b55" (UID: "a33cde98-e7f0-4bc9-a9e5-1a6136039882") : configmap references non-existent config key: service-ca.crt
Apr 16 04:25:41.725597 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:41.725489 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 04:25:41.725597 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:41.725549 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs podName:a33cde98-e7f0-4bc9-a9e5-1a6136039882 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:49.725536568 +0000 UTC m=+97.628312559 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs") pod "router-default-5d4bf78676-l9b55" (UID: "a33cde98-e7f0-4bc9-a9e5-1a6136039882") : secret "router-metrics-certs-default" not found
Apr 16 04:25:41.742370 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:41.742322 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w7h7r_1d98373a-f359-466d-8c72-1e9b4e835c04/dns-node-resolver/0.log"
Apr 16 04:25:42.540914 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:42.540888 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dj5pp_f587afea-01f4-4981-9efb-09779dde706b/node-ca/0.log"
Apr 16 04:25:43.984944 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:43.984905 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr" event={"ID":"2b2d1087-2ad0-4620-96de-619f77fb1045","Type":"ContainerStarted","Data":"66b9035d8a6d655785f10a890ba0de7ac733c370366fa82069bfbe3e6aad7501"}
Apr 16 04:25:44.001391 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:44.001325 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-kgkxr" podStartSLOduration=1.862201079 podStartE2EDuration="4.001307689s" podCreationTimestamp="2026-04-16 04:25:40 +0000 UTC" firstStartedPulling="2026-04-16 04:25:40.743148616 +0000 UTC m=+88.645924593" lastFinishedPulling="2026-04-16 04:25:42.88225521 +0000 UTC m=+90.785031203" observedRunningTime="2026-04-16 04:25:44.00120358 +0000 UTC m=+91.903979605" watchObservedRunningTime="2026-04-16 04:25:44.001307689 +0000 UTC m=+91.904083690"
Apr 16 04:25:44.142858 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:44.142832 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-79vz7_b6856769-4d20-4817-9f00-63ded00b2073/kube-storage-version-migrator-operator/0.log"
Apr 16 04:25:44.276027 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:44.275992 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:25:44.276027 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:44.276033 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:25:44.276536 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:44.276517 2575 scope.go:117] "RemoveContainer" containerID="e362dfe26e3c54c99b6a88b59be18f979b6dc19ff954e4a76ead80aa75d7f4e6"
Apr 16 04:25:44.276753 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:44.276734 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kn57s_openshift-console-operator(ed401949-3fdf-455d-bb0a-400a7a15466c)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" podUID="ed401949-3fdf-455d-bb0a-400a7a15466c"
Apr 16 04:25:49.389737 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.389688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz"
Apr 16 04:25:49.389737 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.389740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:25:49.392052 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.392025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8f8c89-66e4-4302-bf2b-05fd369633cf-metrics-tls\") pod \"dns-default-fvbjz\" (UID: \"6a8f8c89-66e4-4302-bf2b-05fd369633cf\") " pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:25:49.392202 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.392183 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ca4f43-c47b-4ec0-82f8-3a8054f7a277-cert\") pod \"ingress-canary-8pzwz\" (UID: \"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277\") " pod="openshift-ingress-canary/ingress-canary-8pzwz"
Apr 16 04:25:49.681975 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.681947 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-smz8d\""
Apr 16 04:25:49.689207 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.689182 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:25:49.691399 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.691375 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-d4mbv\""
Apr 16 04:25:49.692529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.692512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:49.694984 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.694958 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls\") pod \"image-registry-76684bdc88-r8jsk\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:49.699708 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.699684 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8pzwz"
Apr 16 04:25:49.761282 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.761246 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:49.793031 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.792961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:49.793186 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.793074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:49.793385 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:49.793255 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle podName:a33cde98-e7f0-4bc9-a9e5-1a6136039882 nodeName:}" failed. No retries permitted until 2026-04-16 04:26:05.793233476 +0000 UTC m=+113.696009473 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle") pod "router-default-5d4bf78676-l9b55" (UID: "a33cde98-e7f0-4bc9-a9e5-1a6136039882") : configmap references non-existent config key: service-ca.crt
Apr 16 04:25:49.796354 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.796310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a33cde98-e7f0-4bc9-a9e5-1a6136039882-metrics-certs\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:25:49.822096 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.821781 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fvbjz"]
Apr 16 04:25:49.826022 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:25:49.825992 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a8f8c89_66e4_4302_bf2b_05fd369633cf.slice/crio-823985fafdea06d30e0765db649177f0e68fbf0ed02bdfd3e6d67e014a579bc8 WatchSource:0}: Error finding container 823985fafdea06d30e0765db649177f0e68fbf0ed02bdfd3e6d67e014a579bc8: Status 404 returned error can't find the container with id 823985fafdea06d30e0765db649177f0e68fbf0ed02bdfd3e6d67e014a579bc8
Apr 16 04:25:49.841652 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.841597 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8pzwz"]
Apr 16 04:25:49.845119 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:25:49.845087 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0ca4f43_c47b_4ec0_82f8_3a8054f7a277.slice/crio-ddc67d4b4c5598e536bfa8bea270e20c9ea590cf18844bfd833ad7b98fbd44bd WatchSource:0}: Error finding container ddc67d4b4c5598e536bfa8bea270e20c9ea590cf18844bfd833ad7b98fbd44bd: Status 404 returned error can't find the container with id ddc67d4b4c5598e536bfa8bea270e20c9ea590cf18844bfd833ad7b98fbd44bd
Apr 16 04:25:49.894457 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:49.894424 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76684bdc88-r8jsk"]
Apr 16 04:25:49.897646 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:25:49.897623 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d8b1eea_523b_4786_aa84_e8f73ea58da2.slice/crio-9d6e207a327cf7d21da354d664c2b4ffadc47632dff697fef10d256419016034 WatchSource:0}: Error finding container 9d6e207a327cf7d21da354d664c2b4ffadc47632dff697fef10d256419016034: Status 404 returned error can't find the container with id 9d6e207a327cf7d21da354d664c2b4ffadc47632dff697fef10d256419016034
Apr 16 04:25:50.001955 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:50.001912 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8pzwz" event={"ID":"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277","Type":"ContainerStarted","Data":"ddc67d4b4c5598e536bfa8bea270e20c9ea590cf18844bfd833ad7b98fbd44bd"}
Apr 16 04:25:50.003072 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:50.003044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fvbjz" event={"ID":"6a8f8c89-66e4-4302-bf2b-05fd369633cf","Type":"ContainerStarted","Data":"823985fafdea06d30e0765db649177f0e68fbf0ed02bdfd3e6d67e014a579bc8"}
Apr 16 04:25:50.004391 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:50.004365 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76684bdc88-r8jsk" event={"ID":"3d8b1eea-523b-4786-aa84-e8f73ea58da2","Type":"ContainerStarted","Data":"0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9"}
Apr 16 04:25:50.004503 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:50.004397 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76684bdc88-r8jsk" event={"ID":"3d8b1eea-523b-4786-aa84-e8f73ea58da2","Type":"ContainerStarted","Data":"9d6e207a327cf7d21da354d664c2b4ffadc47632dff697fef10d256419016034"}
Apr 16 04:25:50.004503 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:50.004477 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-76684bdc88-r8jsk"
Apr 16 04:25:50.026766 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:50.026713 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-76684bdc88-r8jsk" podStartSLOduration=17.026697252 podStartE2EDuration="17.026697252s" podCreationTimestamp="2026-04-16 04:25:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:25:50.024745556 +0000 UTC m=+97.927521565" watchObservedRunningTime="2026-04-16 04:25:50.026697252 +0000 UTC m=+97.929473247"
Apr 16 04:25:52.012975 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:52.012887 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fvbjz" event={"ID":"6a8f8c89-66e4-4302-bf2b-05fd369633cf","Type":"ContainerStarted","Data":"e8eb12d98a6ad389c01b7be1b571142724df0e9ec56637e9b1da9823359a8619"}
Apr 16 04:25:52.012975 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:52.012932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fvbjz" event={"ID":"6a8f8c89-66e4-4302-bf2b-05fd369633cf","Type":"ContainerStarted","Data":"9b7acf9898f6a4d6c71237365c98892b7743a6e29a2974a6ddc80a3e0fa134f1"}
Apr 16 04:25:52.013526 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:52.013142 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:25:52.033482 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:52.033431 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fvbjz" podStartSLOduration=65.575270099 podStartE2EDuration="1m7.033417118s" podCreationTimestamp="2026-04-16 04:24:45 +0000 UTC" firstStartedPulling="2026-04-16 04:25:49.828194411 +0000 UTC m=+97.730970403" lastFinishedPulling="2026-04-16 04:25:51.286341435 +0000 UTC m=+99.189117422" observedRunningTime="2026-04-16 04:25:52.031985389 +0000 UTC m=+99.934761391" watchObservedRunningTime="2026-04-16 04:25:52.033417118 +0000 UTC m=+99.936193116"
Apr 16 04:25:53.016820 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:53.016788 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8pzwz" event={"ID":"b0ca4f43-c47b-4ec0-82f8-3a8054f7a277","Type":"ContainerStarted","Data":"98b1c127807d30b4e432ddb8e3e7bfd0014e88c96e2f399d610aad8122991ab6"}
Apr 16 04:25:53.034079 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:53.034029 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8pzwz" podStartSLOduration=65.793765963 podStartE2EDuration="1m8.034014958s" podCreationTimestamp="2026-04-16 04:24:45 +0000 UTC" firstStartedPulling="2026-04-16 04:25:49.84711075 +0000 UTC m=+97.749886735" lastFinishedPulling="2026-04-16 04:25:52.087359753 +0000 UTC m=+99.990135730" observedRunningTime="2026-04-16 04:25:53.032646952 +0000 UTC m=+100.935422951" watchObservedRunningTime="2026-04-16 04:25:53.034014958 +0000 UTC m=+100.936790956"
Apr 16 04:25:56.702758 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:56.702726 2575 scope.go:117] "RemoveContainer" containerID="e362dfe26e3c54c99b6a88b59be18f979b6dc19ff954e4a76ead80aa75d7f4e6"
Apr 16 04:25:57.028540 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:57.028513 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log"
Apr 16 04:25:57.028880 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:57.028862 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/1.log"
Apr 16 04:25:57.028969 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:57.028902 2575 generic.go:358] "Generic (PLEG): container finished" podID="ed401949-3fdf-455d-bb0a-400a7a15466c" containerID="7ab2b2baacc31df355cdb23eee54e03836b143954c38232b25d178b4ec0bd03c" exitCode=255
Apr 16 04:25:57.028969 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:57.028940 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" event={"ID":"ed401949-3fdf-455d-bb0a-400a7a15466c","Type":"ContainerDied","Data":"7ab2b2baacc31df355cdb23eee54e03836b143954c38232b25d178b4ec0bd03c"}
Apr 16 04:25:57.029074 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:57.028976 2575 scope.go:117] "RemoveContainer" containerID="e362dfe26e3c54c99b6a88b59be18f979b6dc19ff954e4a76ead80aa75d7f4e6"
Apr 16 04:25:57.029357 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:57.029320 2575 scope.go:117] "RemoveContainer" containerID="7ab2b2baacc31df355cdb23eee54e03836b143954c38232b25d178b4ec0bd03c"
Apr 16 04:25:57.029552 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:25:57.029531 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kn57s_openshift-console-operator(ed401949-3fdf-455d-bb0a-400a7a15466c)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" podUID="ed401949-3fdf-455d-bb0a-400a7a15466c"
Apr 16 04:25:58.033990 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:25:58.033960 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log"
Apr 16 04:26:01.177366 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.177316 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-76684bdc88-r8jsk"]
Apr 16 04:26:01.227531 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.227496 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kmwkb"]
Apr 16 04:26:01.230903 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.230878 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.233637 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.233608 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 04:26:01.233758 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.233744 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 04:26:01.233819 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.233784 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 04:26:01.234083 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.234066 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 04:26:01.234083 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.234071 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-stqlj\""
Apr 16 04:26:01.240588 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.240550 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kmwkb"]
Apr 16 04:26:01.379411 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.379376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7d5fb402-61cd-404e-9bcf-644c7a73c048-crio-socket\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.379592 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.379423 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7d5fb402-61cd-404e-9bcf-644c7a73c048-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.379592 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.379510 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7xr5\" (UniqueName: \"kubernetes.io/projected/7d5fb402-61cd-404e-9bcf-644c7a73c048-kube-api-access-k7xr5\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.379592 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.379577 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7d5fb402-61cd-404e-9bcf-644c7a73c048-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.379704 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.379604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb402-61cd-404e-9bcf-644c7a73c048-data-volume\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.480039 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.479947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7xr5\" (UniqueName: \"kubernetes.io/projected/7d5fb402-61cd-404e-9bcf-644c7a73c048-kube-api-access-k7xr5\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.480039 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.479998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7d5fb402-61cd-404e-9bcf-644c7a73c048-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.480039 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.480021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb402-61cd-404e-9bcf-644c7a73c048-data-volume\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.480303 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.480046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7d5fb402-61cd-404e-9bcf-644c7a73c048-crio-socket\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.480303 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.480070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7d5fb402-61cd-404e-9bcf-644c7a73c048-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.480303 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.480185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7d5fb402-61cd-404e-9bcf-644c7a73c048-crio-socket\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.480440 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.480397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb402-61cd-404e-9bcf-644c7a73c048-data-volume\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.480646 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.480625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7d5fb402-61cd-404e-9bcf-644c7a73c048-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.482543 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.482526 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7d5fb402-61cd-404e-9bcf-644c7a73c048-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.487279 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.487259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7xr5\" (UniqueName: \"kubernetes.io/projected/7d5fb402-61cd-404e-9bcf-644c7a73c048-kube-api-access-k7xr5\") pod \"insights-runtime-extractor-kmwkb\" (UID: \"7d5fb402-61cd-404e-9bcf-644c7a73c048\") " pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.539602 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.539563 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kmwkb"
Apr 16 04:26:01.674613 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:01.674583 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kmwkb"]
Apr 16 04:26:01.677503 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:26:01.677470 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d5fb402_61cd_404e_9bcf_644c7a73c048.slice/crio-72f1dbe1630ec16bc58bce62d94e4f64d7d01286c975fd8f1f7a9a49f055f143 WatchSource:0}: Error finding container 72f1dbe1630ec16bc58bce62d94e4f64d7d01286c975fd8f1f7a9a49f055f143: Status 404 returned error can't find the container with id 72f1dbe1630ec16bc58bce62d94e4f64d7d01286c975fd8f1f7a9a49f055f143
Apr 16 04:26:02.019517 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:02.019489 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fvbjz"
Apr 16 04:26:02.044982 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:02.044953 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kmwkb" event={"ID":"7d5fb402-61cd-404e-9bcf-644c7a73c048","Type":"ContainerStarted","Data":"659c4291a36f489911c54dd3274f6523df83ccc707c1269138a98d7c4a394222"}
Apr 16 04:26:02.044982 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:02.044987 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kmwkb" event={"ID":"7d5fb402-61cd-404e-9bcf-644c7a73c048","Type":"ContainerStarted","Data":"72f1dbe1630ec16bc58bce62d94e4f64d7d01286c975fd8f1f7a9a49f055f143"}
Apr 16 04:26:03.049582 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:03.049547 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kmwkb" event={"ID":"7d5fb402-61cd-404e-9bcf-644c7a73c048","Type":"ContainerStarted","Data":"d7095a70beed42c24f994e3722d95da0c179bf1989340a8206b998cd221520a3"}
Apr 16 04:26:04.053598 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:04.053560 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kmwkb" event={"ID":"7d5fb402-61cd-404e-9bcf-644c7a73c048","Type":"ContainerStarted","Data":"fbb5281b9498a82ad98ad8bb7ee1628158fd61be22255cce16ffb305701f2f7e"}
Apr 16 04:26:04.069563 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:04.069510 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kmwkb" podStartSLOduration=0.877464576 podStartE2EDuration="3.069494119s" podCreationTimestamp="2026-04-16 04:26:01 +0000 UTC" firstStartedPulling="2026-04-16 04:26:01.732756056 +0000 UTC m=+109.635532033" lastFinishedPulling="2026-04-16 04:26:03.924785592 +0000 UTC m=+111.827561576" observedRunningTime="2026-04-16 04:26:04.068879203 +0000 UTC m=+111.971655203" watchObservedRunningTime="2026-04-16 04:26:04.069494119 +0000 UTC m=+111.972270116"
Apr 16 04:26:04.276586 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:04.276549 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:26:04.276586 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:04.276582 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s"
Apr 16 04:26:04.276924 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:04.276911 2575 scope.go:117] "RemoveContainer" containerID="7ab2b2baacc31df355cdb23eee54e03836b143954c38232b25d178b4ec0bd03c"
Apr 16 04:26:04.277101 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:26:04.277085 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kn57s_openshift-console-operator(ed401949-3fdf-455d-bb0a-400a7a15466c)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" podUID="ed401949-3fdf-455d-bb0a-400a7a15466c"
Apr 16 04:26:05.810973 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:05.810930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:26:05.811561 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:05.811540 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a33cde98-e7f0-4bc9-a9e5-1a6136039882-service-ca-bundle\") pod \"router-default-5d4bf78676-l9b55\" (UID: \"a33cde98-e7f0-4bc9-a9e5-1a6136039882\") " pod="openshift-ingress/router-default-5d4bf78676-l9b55"
Apr 16 04:26:06.069696 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:06.069597 2575
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:26:06.184422 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:06.184391 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5d4bf78676-l9b55"] Apr 16 04:26:06.189033 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:26:06.189005 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda33cde98_e7f0_4bc9_a9e5_1a6136039882.slice/crio-984f83d887fe969fe27cc7f55ecdba565cc7057f37ab3360a6ef07d273ec7535 WatchSource:0}: Error finding container 984f83d887fe969fe27cc7f55ecdba565cc7057f37ab3360a6ef07d273ec7535: Status 404 returned error can't find the container with id 984f83d887fe969fe27cc7f55ecdba565cc7057f37ab3360a6ef07d273ec7535 Apr 16 04:26:07.062007 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:07.061972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5d4bf78676-l9b55" event={"ID":"a33cde98-e7f0-4bc9-a9e5-1a6136039882","Type":"ContainerStarted","Data":"23f1b73319aa4ec8e675d048a4941874f0bb7686975e2ff7905fd8be2830e0b4"} Apr 16 04:26:07.062007 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:07.062005 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5d4bf78676-l9b55" event={"ID":"a33cde98-e7f0-4bc9-a9e5-1a6136039882","Type":"ContainerStarted","Data":"984f83d887fe969fe27cc7f55ecdba565cc7057f37ab3360a6ef07d273ec7535"} Apr 16 04:26:07.070589 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:07.070568 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:26:07.073070 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:07.073050 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 
16 04:26:07.080486 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:07.080435 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5d4bf78676-l9b55" podStartSLOduration=34.080416246 podStartE2EDuration="34.080416246s" podCreationTimestamp="2026-04-16 04:25:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:26:07.079036503 +0000 UTC m=+114.981812493" watchObservedRunningTime="2026-04-16 04:26:07.080416246 +0000 UTC m=+114.983192246" Apr 16 04:26:08.065190 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:08.065154 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:26:08.066399 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:08.066382 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5d4bf78676-l9b55" Apr 16 04:26:11.182159 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:11.182132 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-76684bdc88-r8jsk" Apr 16 04:26:15.703292 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:15.703258 2575 scope.go:117] "RemoveContainer" containerID="7ab2b2baacc31df355cdb23eee54e03836b143954c38232b25d178b4ec0bd03c" Apr 16 04:26:15.703777 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:26:15.703459 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kn57s_openshift-console-operator(ed401949-3fdf-455d-bb0a-400a7a15466c)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" podUID="ed401949-3fdf-455d-bb0a-400a7a15466c" Apr 16 04:26:19.629415 ip-10-0-133-103 kubenswrapper[2575]: 
I0416 04:26:19.629377 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-f2phz"] Apr 16 04:26:19.634831 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.634813 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" Apr 16 04:26:19.638793 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.638769 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-st4wc\"" Apr 16 04:26:19.638929 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.638852 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 04:26:19.638929 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.638856 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 04:26:19.638929 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.638856 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 04:26:19.639108 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.638905 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 04:26:19.639108 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.639093 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 04:26:19.641063 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.641038 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-f2phz"] Apr 16 04:26:19.715580 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.715542 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/caa3e4a8-086e-41c4-8c8d-baead7f9e776-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: \"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" Apr 16 04:26:19.715580 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.715581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/caa3e4a8-086e-41c4-8c8d-baead7f9e776-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: \"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" Apr 16 04:26:19.715876 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.715623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjr5g\" (UniqueName: \"kubernetes.io/projected/caa3e4a8-086e-41c4-8c8d-baead7f9e776-kube-api-access-fjr5g\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: \"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" Apr 16 04:26:19.715876 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.715669 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/caa3e4a8-086e-41c4-8c8d-baead7f9e776-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: \"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" Apr 16 04:26:19.750710 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.750671 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-lmlvh"] Apr 16 04:26:19.754482 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.754460 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:19.757524 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.757500 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s26sx\"" Apr 16 04:26:19.757657 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.757532 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 04:26:19.757657 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.757602 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 04:26:19.757831 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.757817 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 04:26:19.817090 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.817054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjr5g\" (UniqueName: \"kubernetes.io/projected/caa3e4a8-086e-41c4-8c8d-baead7f9e776-kube-api-access-fjr5g\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: \"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" Apr 16 04:26:19.817299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.817106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/caa3e4a8-086e-41c4-8c8d-baead7f9e776-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: \"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") 
" pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" Apr 16 04:26:19.817299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.817249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/caa3e4a8-086e-41c4-8c8d-baead7f9e776-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: \"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" Apr 16 04:26:19.817299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.817280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/caa3e4a8-086e-41c4-8c8d-baead7f9e776-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: \"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" Apr 16 04:26:19.817608 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:26:19.817588 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 04:26:19.817702 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:26:19.817690 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa3e4a8-086e-41c4-8c8d-baead7f9e776-openshift-state-metrics-tls podName:caa3e4a8-086e-41c4-8c8d-baead7f9e776 nodeName:}" failed. No retries permitted until 2026-04-16 04:26:20.31767108 +0000 UTC m=+128.220447063 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/caa3e4a8-086e-41c4-8c8d-baead7f9e776-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-f2phz" (UID: "caa3e4a8-086e-41c4-8c8d-baead7f9e776") : secret "openshift-state-metrics-tls" not found Apr 16 04:26:19.818049 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.818029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/caa3e4a8-086e-41c4-8c8d-baead7f9e776-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: \"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" Apr 16 04:26:19.820866 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.820838 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/caa3e4a8-086e-41c4-8c8d-baead7f9e776-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: \"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" Apr 16 04:26:19.829757 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.829730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjr5g\" (UniqueName: \"kubernetes.io/projected/caa3e4a8-086e-41c4-8c8d-baead7f9e776-kube-api-access-fjr5g\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: \"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" Apr 16 04:26:19.918284 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.918192 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-tls\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:19.918284 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.918234 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b4ab77b7-6bc2-4d37-b605-775e02af4e25-root\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:19.918494 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.918289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4ab77b7-6bc2-4d37-b605-775e02af4e25-sys\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:19.918494 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.918345 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-accelerators-collector-config\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:19.918494 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.918372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-wtmp\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:19.918494 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.918392 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:19.918494 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.918422 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b4ab77b7-6bc2-4d37-b605-775e02af4e25-metrics-client-ca\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:19.918494 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.918464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5k6j\" (UniqueName: \"kubernetes.io/projected/b4ab77b7-6bc2-4d37-b605-775e02af4e25-kube-api-access-f5k6j\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:19.918494 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:19.918483 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-textfile\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.019167 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.019134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5k6j\" (UniqueName: \"kubernetes.io/projected/b4ab77b7-6bc2-4d37-b605-775e02af4e25-kube-api-access-f5k6j\") pod \"node-exporter-lmlvh\" (UID: 
\"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.019167 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.019181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-textfile\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.019470 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.019230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-tls\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.019470 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.019265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b4ab77b7-6bc2-4d37-b605-775e02af4e25-root\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.019470 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.019320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4ab77b7-6bc2-4d37-b605-775e02af4e25-sys\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.019470 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.019388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-accelerators-collector-config\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.019470 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.019419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-wtmp\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.019470 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.019447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.019794 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.019480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b4ab77b7-6bc2-4d37-b605-775e02af4e25-metrics-client-ca\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.019794 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.019498 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b4ab77b7-6bc2-4d37-b605-775e02af4e25-root\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.019794 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:26:20.019618 2575 secret.go:189] Couldn't get secret 
openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 04:26:20.019794 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:26:20.019692 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-tls podName:b4ab77b7-6bc2-4d37-b605-775e02af4e25 nodeName:}" failed. No retries permitted until 2026-04-16 04:26:20.519671819 +0000 UTC m=+128.422447805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-tls") pod "node-exporter-lmlvh" (UID: "b4ab77b7-6bc2-4d37-b605-775e02af4e25") : secret "node-exporter-tls" not found Apr 16 04:26:20.020422 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.020089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-accelerators-collector-config\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.020422 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.020242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-wtmp\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.020422 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.020302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4ab77b7-6bc2-4d37-b605-775e02af4e25-sys\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.020422 
ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.020368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b4ab77b7-6bc2-4d37-b605-775e02af4e25-metrics-client-ca\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.020665 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.020511 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-textfile\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.022552 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.022520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.037883 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.037850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5k6j\" (UniqueName: \"kubernetes.io/projected/b4ab77b7-6bc2-4d37-b605-775e02af4e25-kube-api-access-f5k6j\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh" Apr 16 04:26:20.321469 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.321435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/caa3e4a8-086e-41c4-8c8d-baead7f9e776-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: 
\"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz"
Apr 16 04:26:20.323827 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.323801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/caa3e4a8-086e-41c4-8c8d-baead7f9e776-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-f2phz\" (UID: \"caa3e4a8-086e-41c4-8c8d-baead7f9e776\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz"
Apr 16 04:26:20.522828 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.522786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-tls\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh"
Apr 16 04:26:20.525088 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.525066 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b4ab77b7-6bc2-4d37-b605-775e02af4e25-node-exporter-tls\") pod \"node-exporter-lmlvh\" (UID: \"b4ab77b7-6bc2-4d37-b605-775e02af4e25\") " pod="openshift-monitoring/node-exporter-lmlvh"
Apr 16 04:26:20.544157 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.544122 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz"
Apr 16 04:26:20.660728 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.660697 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-f2phz"]
Apr 16 04:26:20.663602 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:26:20.663573 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaa3e4a8_086e_41c4_8c8d_baead7f9e776.slice/crio-d87f88f609cc27de94c4f659699492764935af64c88e4e9bbfb4a47910052347 WatchSource:0}: Error finding container d87f88f609cc27de94c4f659699492764935af64c88e4e9bbfb4a47910052347: Status 404 returned error can't find the container with id d87f88f609cc27de94c4f659699492764935af64c88e4e9bbfb4a47910052347
Apr 16 04:26:20.663730 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.663715 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lmlvh"
Apr 16 04:26:20.671655 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:26:20.671630 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4ab77b7_6bc2_4d37_b605_775e02af4e25.slice/crio-0f15115077896eebf19c12b27900b124d16cacb7c82720060a1c1de0607d0f5b WatchSource:0}: Error finding container 0f15115077896eebf19c12b27900b124d16cacb7c82720060a1c1de0607d0f5b: Status 404 returned error can't find the container with id 0f15115077896eebf19c12b27900b124d16cacb7c82720060a1c1de0607d0f5b
Apr 16 04:26:20.847292 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.847219 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 04:26:20.850815 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.850794 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:20.853416 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.853395 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 04:26:20.853539 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.853428 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 04:26:20.853539 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.853445 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-j6ksw\""
Apr 16 04:26:20.853539 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.853398 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 04:26:20.853539 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.853525 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 04:26:20.854053 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.854032 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 04:26:20.854157 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.854052 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 04:26:20.854157 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.854061 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 04:26:20.854157 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.854114 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 04:26:20.854324 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.854175 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 04:26:20.863375 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:20.863353 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 04:26:21.026470 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-web-config\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.026643 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.026643 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmjvx\" (UniqueName: \"kubernetes.io/projected/04fc0933-f968-413e-a220-0cef26f8f792-kube-api-access-cmjvx\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.026643 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.026643 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04fc0933-f968-413e-a220-0cef26f8f792-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.026643 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.026643 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/04fc0933-f968-413e-a220-0cef26f8f792-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.026933 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026650 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/04fc0933-f968-413e-a220-0cef26f8f792-tls-assets\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.026933 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/04fc0933-f968-413e-a220-0cef26f8f792-config-out\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.026933 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026750 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.026933 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04fc0933-f968-413e-a220-0cef26f8f792-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.026933 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.026933 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.026888 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-config-volume\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.097983 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.097891 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lmlvh" event={"ID":"b4ab77b7-6bc2-4d37-b605-775e02af4e25","Type":"ContainerStarted","Data":"0f15115077896eebf19c12b27900b124d16cacb7c82720060a1c1de0607d0f5b"}
Apr 16 04:26:21.099748 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.099716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" event={"ID":"caa3e4a8-086e-41c4-8c8d-baead7f9e776","Type":"ContainerStarted","Data":"94e4fce0a4f503b863074d626498a59f0eb1414ecb96231f9ebd0e8d6e77811b"}
Apr 16 04:26:21.099886 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.099756 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" event={"ID":"caa3e4a8-086e-41c4-8c8d-baead7f9e776","Type":"ContainerStarted","Data":"b26042783e4995a18372dfdafb89fd7c2056d435db768c643937066d3f36679f"}
Apr 16 04:26:21.099886 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.099769 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" event={"ID":"caa3e4a8-086e-41c4-8c8d-baead7f9e776","Type":"ContainerStarted","Data":"d87f88f609cc27de94c4f659699492764935af64c88e4e9bbfb4a47910052347"}
Apr 16 04:26:21.128305 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.128507 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-config-volume\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.128507 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-web-config\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.128507 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.128507 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmjvx\" (UniqueName: \"kubernetes.io/projected/04fc0933-f968-413e-a220-0cef26f8f792-kube-api-access-cmjvx\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.128507 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.128751 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04fc0933-f968-413e-a220-0cef26f8f792-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.128751 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.128751 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/04fc0933-f968-413e-a220-0cef26f8f792-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.128751 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/04fc0933-f968-413e-a220-0cef26f8f792-tls-assets\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.128751 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/04fc0933-f968-413e-a220-0cef26f8f792-config-out\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.128751 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.128751 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.128713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04fc0933-f968-413e-a220-0cef26f8f792-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.129247 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.129211 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/04fc0933-f968-413e-a220-0cef26f8f792-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.130482 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.130131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04fc0933-f968-413e-a220-0cef26f8f792-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.130482 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.130299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04fc0933-f968-413e-a220-0cef26f8f792-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.131748 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.131663 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-config-volume\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.131748 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.131667 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.131899 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.131881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.132676 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.132627 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-web-config\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.132676 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.132648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.133008 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.132985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.133212 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.133187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.133840 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.133816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/04fc0933-f968-413e-a220-0cef26f8f792-config-out\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.134079 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.134060 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/04fc0933-f968-413e-a220-0cef26f8f792-tls-assets\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.139783 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.139760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmjvx\" (UniqueName: \"kubernetes.io/projected/04fc0933-f968-413e-a220-0cef26f8f792-kube-api-access-cmjvx\") pod \"alertmanager-main-0\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.160756 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.160722 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:26:21.295458 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:21.295422 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 04:26:21.378717 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:26:21.378651 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04fc0933_f968_413e_a220_0cef26f8f792.slice/crio-81baf301a80bad968ef29adbbf203472fda793b201f3884ed81642cbd9354297 WatchSource:0}: Error finding container 81baf301a80bad968ef29adbbf203472fda793b201f3884ed81642cbd9354297: Status 404 returned error can't find the container with id 81baf301a80bad968ef29adbbf203472fda793b201f3884ed81642cbd9354297
Apr 16 04:26:22.105929 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.105896 2575 generic.go:358] "Generic (PLEG): container finished" podID="b4ab77b7-6bc2-4d37-b605-775e02af4e25" containerID="83d23e67d06d610415e3db39ad8ab65ab206baec02a3568380c27a8201df1651" exitCode=0
Apr 16 04:26:22.106315 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.105985 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lmlvh" event={"ID":"b4ab77b7-6bc2-4d37-b605-775e02af4e25","Type":"ContainerDied","Data":"83d23e67d06d610415e3db39ad8ab65ab206baec02a3568380c27a8201df1651"}
Apr 16 04:26:22.107068 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.107050 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerStarted","Data":"81baf301a80bad968ef29adbbf203472fda793b201f3884ed81642cbd9354297"}
Apr 16 04:26:22.441139 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.441055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:26:22.443953 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.443920 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414aeb12-26d8-4994-8fe8-1cad0342c929-metrics-certs\") pod \"network-metrics-daemon-875wc\" (UID: \"414aeb12-26d8-4994-8fe8-1cad0342c929\") " pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:26:22.617869 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.617834 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xfxbt\""
Apr 16 04:26:22.626116 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.626080 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-875wc"
Apr 16 04:26:22.725447 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.724906 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"]
Apr 16 04:26:22.729075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.729046 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.731594 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.731572 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 04:26:22.731748 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.731726 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 04:26:22.731889 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.731868 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4ql5u781vabc8\""
Apr 16 04:26:22.732032 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.732012 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 04:26:22.732169 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.732150 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 04:26:22.732266 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.732150 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-xpxfd\""
Apr 16 04:26:22.740239 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.740200 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"]
Apr 16 04:26:22.740557 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.740541 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 04:26:22.844832 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.844796 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.845017 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.844841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-grpc-tls\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.845017 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.844867 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-tls\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.845017 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.844920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.845167 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.845024 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1e4d17e-936f-4c41-ae50-b17dd9528813-metrics-client-ca\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.845167 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.845063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcrwd\" (UniqueName: \"kubernetes.io/projected/b1e4d17e-936f-4c41-ae50-b17dd9528813-kube-api-access-wcrwd\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.845167 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.845112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.845167 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.845146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.946671 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.946635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1e4d17e-936f-4c41-ae50-b17dd9528813-metrics-client-ca\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.946786 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.946681 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcrwd\" (UniqueName: \"kubernetes.io/projected/b1e4d17e-936f-4c41-ae50-b17dd9528813-kube-api-access-wcrwd\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.946786 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.946721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.946786 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.946752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.946880 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.946791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.946880 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.946819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-grpc-tls\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.946880 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.946843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-tls\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.946880 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.946870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.947778 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.947746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1e4d17e-936f-4c41-ae50-b17dd9528813-metrics-client-ca\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.950384 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.950095 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.951503 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.951075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.953170 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.952424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.953170 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.952711 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-tls\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.953170 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.953085 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.953701 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.953679 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b1e4d17e-936f-4c41-ae50-b17dd9528813-secret-grpc-tls\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:22.955055 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:22.955033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcrwd\" (UniqueName: \"kubernetes.io/projected/b1e4d17e-936f-4c41-ae50-b17dd9528813-kube-api-access-wcrwd\") pod \"thanos-querier-84c6dc88d4-45qc8\" (UID: \"b1e4d17e-936f-4c41-ae50-b17dd9528813\") " pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"
Apr 16 04:26:23.010022 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:23.009976 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-875wc"]
Apr 16 04:26:23.012637 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:26:23.012609 2575 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod414aeb12_26d8_4994_8fe8_1cad0342c929.slice/crio-aca6b640824541a2fd654231a385f07f48cb52fe0d23fb4f6aea74618b3d7888 WatchSource:0}: Error finding container aca6b640824541a2fd654231a385f07f48cb52fe0d23fb4f6aea74618b3d7888: Status 404 returned error can't find the container with id aca6b640824541a2fd654231a385f07f48cb52fe0d23fb4f6aea74618b3d7888 Apr 16 04:26:23.045872 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:23.045840 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8" Apr 16 04:26:23.113429 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:23.112888 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lmlvh" event={"ID":"b4ab77b7-6bc2-4d37-b605-775e02af4e25","Type":"ContainerStarted","Data":"7d05397f80ba4df06b7ce5f840febc4340efa808ed546270842a846c9fe73ef7"} Apr 16 04:26:23.113429 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:23.112934 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lmlvh" event={"ID":"b4ab77b7-6bc2-4d37-b605-775e02af4e25","Type":"ContainerStarted","Data":"134e4c4c75c16ed555a44f9c0e9aa9198d5f32b8851f55c741128d72990ef545"} Apr 16 04:26:23.115246 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:23.115217 2575 generic.go:358] "Generic (PLEG): container finished" podID="04fc0933-f968-413e-a220-0cef26f8f792" containerID="9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610" exitCode=0 Apr 16 04:26:23.115382 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:23.115295 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerDied","Data":"9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610"} Apr 16 04:26:23.117940 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:23.116854 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-875wc" event={"ID":"414aeb12-26d8-4994-8fe8-1cad0342c929","Type":"ContainerStarted","Data":"aca6b640824541a2fd654231a385f07f48cb52fe0d23fb4f6aea74618b3d7888"} Apr 16 04:26:23.119823 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:23.119797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" event={"ID":"caa3e4a8-086e-41c4-8c8d-baead7f9e776","Type":"ContainerStarted","Data":"d48e9f2285918e48dabcee0c0fab96ad05d5b88b93c6f79a3179bfbdca3cadbb"} Apr 16 04:26:23.130828 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:23.130775 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lmlvh" podStartSLOduration=3.380555811 podStartE2EDuration="4.130754824s" podCreationTimestamp="2026-04-16 04:26:19 +0000 UTC" firstStartedPulling="2026-04-16 04:26:20.673060209 +0000 UTC m=+128.575836186" lastFinishedPulling="2026-04-16 04:26:21.423259218 +0000 UTC m=+129.326035199" observedRunningTime="2026-04-16 04:26:23.12996726 +0000 UTC m=+131.032743295" watchObservedRunningTime="2026-04-16 04:26:23.130754824 +0000 UTC m=+131.033530824" Apr 16 04:26:23.148391 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:23.148314 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-f2phz" podStartSLOduration=2.892139582 podStartE2EDuration="4.148295621s" podCreationTimestamp="2026-04-16 04:26:19 +0000 UTC" firstStartedPulling="2026-04-16 04:26:20.810093888 +0000 UTC m=+128.712869866" lastFinishedPulling="2026-04-16 04:26:22.066249928 +0000 UTC m=+129.969025905" observedRunningTime="2026-04-16 04:26:23.146162311 +0000 UTC m=+131.048938323" watchObservedRunningTime="2026-04-16 04:26:23.148295621 +0000 UTC m=+131.051071622" Apr 16 04:26:23.186927 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:23.186892 
2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-84c6dc88d4-45qc8"] Apr 16 04:26:23.190878 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:26:23.190848 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1e4d17e_936f_4c41_ae50_b17dd9528813.slice/crio-3e705efb6dd34da41f97424cb5902c2a19a341995a0ea21f1131babd299ff6cf WatchSource:0}: Error finding container 3e705efb6dd34da41f97424cb5902c2a19a341995a0ea21f1131babd299ff6cf: Status 404 returned error can't find the container with id 3e705efb6dd34da41f97424cb5902c2a19a341995a0ea21f1131babd299ff6cf Apr 16 04:26:24.125246 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:24.124685 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8" event={"ID":"b1e4d17e-936f-4c41-ae50-b17dd9528813","Type":"ContainerStarted","Data":"3e705efb6dd34da41f97424cb5902c2a19a341995a0ea21f1131babd299ff6cf"} Apr 16 04:26:25.129763 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:25.129728 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-875wc" event={"ID":"414aeb12-26d8-4994-8fe8-1cad0342c929","Type":"ContainerStarted","Data":"56b00326fbcfb8dc8fbaa84e9c9c3d29788ace668b68c0d96baffec49115e1ea"} Apr 16 04:26:25.130174 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:25.129769 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-875wc" event={"ID":"414aeb12-26d8-4994-8fe8-1cad0342c929","Type":"ContainerStarted","Data":"da39cc9afcc553f9ad5d60e627bb7006e68413a5c16734e3a3b4b3457be68ac8"} Apr 16 04:26:25.147067 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:25.146991 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-875wc" podStartSLOduration=132.088819597 podStartE2EDuration="2m13.146971595s" 
podCreationTimestamp="2026-04-16 04:24:12 +0000 UTC" firstStartedPulling="2026-04-16 04:26:23.015012736 +0000 UTC m=+130.917788713" lastFinishedPulling="2026-04-16 04:26:24.073164731 +0000 UTC m=+131.975940711" observedRunningTime="2026-04-16 04:26:25.144519941 +0000 UTC m=+133.047295939" watchObservedRunningTime="2026-04-16 04:26:25.146971595 +0000 UTC m=+133.049747597" Apr 16 04:26:26.136299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.136262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerStarted","Data":"87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f"} Apr 16 04:26:26.136299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.136305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerStarted","Data":"b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e"} Apr 16 04:26:26.136828 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.136316 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerStarted","Data":"7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb"} Apr 16 04:26:26.136828 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.136356 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerStarted","Data":"17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec"} Apr 16 04:26:26.136828 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.136370 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerStarted","Data":"c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49"} Apr 16 04:26:26.138184 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.138155 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8" event={"ID":"b1e4d17e-936f-4c41-ae50-b17dd9528813","Type":"ContainerStarted","Data":"254e48081471f14654aa422239a8b3743b94dc655f5ea3ddc1fdd54e382f2acb"} Apr 16 04:26:26.138286 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.138191 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8" event={"ID":"b1e4d17e-936f-4c41-ae50-b17dd9528813","Type":"ContainerStarted","Data":"0045defce7ddf651fd3866b8a9d86fa3593dda4c427570b393d6d969ae296309"} Apr 16 04:26:26.138286 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.138205 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8" event={"ID":"b1e4d17e-936f-4c41-ae50-b17dd9528813","Type":"ContainerStarted","Data":"730cf3cbf8c788b15b223c91b20417fed7a7144114c448c108b22e7321a8f2ba"} Apr 16 04:26:26.195807 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.195758 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-76684bdc88-r8jsk" podUID="3d8b1eea-523b-4786-aa84-e8f73ea58da2" containerName="registry" containerID="cri-o://0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9" gracePeriod=30 Apr 16 04:26:26.489294 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.489271 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-76684bdc88-r8jsk" Apr 16 04:26:26.581203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.581181 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d8b1eea-523b-4786-aa84-e8f73ea58da2-image-registry-private-configuration\") pod \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " Apr 16 04:26:26.581308 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.581234 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d8b1eea-523b-4786-aa84-e8f73ea58da2-installation-pull-secrets\") pod \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " Apr 16 04:26:26.581308 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.581251 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-certificates\") pod \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " Apr 16 04:26:26.581308 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.581296 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-bound-sa-token\") pod \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " Apr 16 04:26:26.581453 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.581318 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls\") pod \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\" (UID: 
\"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " Apr 16 04:26:26.581453 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.581428 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d8b1eea-523b-4786-aa84-e8f73ea58da2-trusted-ca\") pod \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " Apr 16 04:26:26.581541 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.581500 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snjv2\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-kube-api-access-snjv2\") pod \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " Apr 16 04:26:26.581590 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.581554 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d8b1eea-523b-4786-aa84-e8f73ea58da2-ca-trust-extracted\") pod \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\" (UID: \"3d8b1eea-523b-4786-aa84-e8f73ea58da2\") " Apr 16 04:26:26.581798 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.581763 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3d8b1eea-523b-4786-aa84-e8f73ea58da2" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 04:26:26.581914 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.581813 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d8b1eea-523b-4786-aa84-e8f73ea58da2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3d8b1eea-523b-4786-aa84-e8f73ea58da2" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 04:26:26.583642 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.583609 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d8b1eea-523b-4786-aa84-e8f73ea58da2-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3d8b1eea-523b-4786-aa84-e8f73ea58da2" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:26:26.583764 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.583739 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3d8b1eea-523b-4786-aa84-e8f73ea58da2" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:26:26.583859 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.583838 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3d8b1eea-523b-4786-aa84-e8f73ea58da2" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:26:26.583992 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.583965 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d8b1eea-523b-4786-aa84-e8f73ea58da2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3d8b1eea-523b-4786-aa84-e8f73ea58da2" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:26:26.584070 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.584000 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-kube-api-access-snjv2" (OuterVolumeSpecName: "kube-api-access-snjv2") pod "3d8b1eea-523b-4786-aa84-e8f73ea58da2" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2"). InnerVolumeSpecName "kube-api-access-snjv2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:26:26.590073 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.590044 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d8b1eea-523b-4786-aa84-e8f73ea58da2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3d8b1eea-523b-4786-aa84-e8f73ea58da2" (UID: "3d8b1eea-523b-4786-aa84-e8f73ea58da2"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:26:26.682258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.682177 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-bound-sa-token\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:26:26.682258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.682214 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-tls\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:26:26.682258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.682228 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d8b1eea-523b-4786-aa84-e8f73ea58da2-trusted-ca\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:26:26.682258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.682242 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-snjv2\" (UniqueName: \"kubernetes.io/projected/3d8b1eea-523b-4786-aa84-e8f73ea58da2-kube-api-access-snjv2\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:26:26.682258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.682256 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d8b1eea-523b-4786-aa84-e8f73ea58da2-ca-trust-extracted\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:26:26.682585 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.682272 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d8b1eea-523b-4786-aa84-e8f73ea58da2-image-registry-private-configuration\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 
04:26:26.682585 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.682287 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d8b1eea-523b-4786-aa84-e8f73ea58da2-installation-pull-secrets\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:26:26.682585 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:26.682302 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d8b1eea-523b-4786-aa84-e8f73ea58da2-registry-certificates\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:26:27.143560 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.143519 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerStarted","Data":"b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6"} Apr 16 04:26:27.145932 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.145905 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8" event={"ID":"b1e4d17e-936f-4c41-ae50-b17dd9528813","Type":"ContainerStarted","Data":"969f8332ec7fe1667bb071694f535af7354b76982335c9c041ae4a39d26d2b80"} Apr 16 04:26:27.146054 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.145939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8" event={"ID":"b1e4d17e-936f-4c41-ae50-b17dd9528813","Type":"ContainerStarted","Data":"003e54b3c08feb45591b0af11ef8fc1a8e9506a73c8743ad88d6d2e098de3f3e"} Apr 16 04:26:27.146054 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.145953 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8" 
event={"ID":"b1e4d17e-936f-4c41-ae50-b17dd9528813","Type":"ContainerStarted","Data":"3caeb55da8edfe0167d298179494b8a065b951a34d0fb02db21ec467f324c606"} Apr 16 04:26:27.146154 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.146059 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8" Apr 16 04:26:27.146866 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.146842 2575 generic.go:358] "Generic (PLEG): container finished" podID="3d8b1eea-523b-4786-aa84-e8f73ea58da2" containerID="0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9" exitCode=0 Apr 16 04:26:27.146939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.146876 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76684bdc88-r8jsk" event={"ID":"3d8b1eea-523b-4786-aa84-e8f73ea58da2","Type":"ContainerDied","Data":"0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9"} Apr 16 04:26:27.146939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.146893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76684bdc88-r8jsk" event={"ID":"3d8b1eea-523b-4786-aa84-e8f73ea58da2","Type":"ContainerDied","Data":"9d6e207a327cf7d21da354d664c2b4ffadc47632dff697fef10d256419016034"} Apr 16 04:26:27.146939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.146908 2575 scope.go:117] "RemoveContainer" containerID="0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9" Apr 16 04:26:27.146939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.146915 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-76684bdc88-r8jsk" Apr 16 04:26:27.154525 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.154510 2575 scope.go:117] "RemoveContainer" containerID="0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9" Apr 16 04:26:27.154789 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:26:27.154769 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9\": container with ID starting with 0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9 not found: ID does not exist" containerID="0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9" Apr 16 04:26:27.154853 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.154800 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9"} err="failed to get container status \"0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9\": rpc error: code = NotFound desc = could not find container \"0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9\": container with ID starting with 0f28e45c5b5c82c50db14627b595a5d3ef7e7baff5bcb8a7bdc552f5bb63f3a9 not found: ID does not exist" Apr 16 04:26:27.172452 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.172408 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.122586689 podStartE2EDuration="7.172394131s" podCreationTimestamp="2026-04-16 04:26:20 +0000 UTC" firstStartedPulling="2026-04-16 04:26:21.380648596 +0000 UTC m=+129.283424576" lastFinishedPulling="2026-04-16 04:26:26.430456038 +0000 UTC m=+134.333232018" observedRunningTime="2026-04-16 04:26:27.171697816 +0000 UTC m=+135.074473815" watchObservedRunningTime="2026-04-16 04:26:27.172394131 
+0000 UTC m=+135.075170129" Apr 16 04:26:27.194788 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.194755 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-76684bdc88-r8jsk"] Apr 16 04:26:27.202895 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.202866 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-76684bdc88-r8jsk"] Apr 16 04:26:27.221768 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:27.221719 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8" podStartSLOduration=1.9859699800000001 podStartE2EDuration="5.221705477s" podCreationTimestamp="2026-04-16 04:26:22 +0000 UTC" firstStartedPulling="2026-04-16 04:26:23.193206689 +0000 UTC m=+131.095982666" lastFinishedPulling="2026-04-16 04:26:26.428942173 +0000 UTC m=+134.331718163" observedRunningTime="2026-04-16 04:26:27.219748763 +0000 UTC m=+135.122524763" watchObservedRunningTime="2026-04-16 04:26:27.221705477 +0000 UTC m=+135.124481476" Apr 16 04:26:28.706454 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:28.706423 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d8b1eea-523b-4786-aa84-e8f73ea58da2" path="/var/lib/kubelet/pods/3d8b1eea-523b-4786-aa84-e8f73ea58da2/volumes" Apr 16 04:26:30.702846 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:30.702815 2575 scope.go:117] "RemoveContainer" containerID="7ab2b2baacc31df355cdb23eee54e03836b143954c38232b25d178b4ec0bd03c" Apr 16 04:26:31.162532 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.162504 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:26:31.162691 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.162583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" event={"ID":"ed401949-3fdf-455d-bb0a-400a7a15466c","Type":"ContainerStarted","Data":"0a4d17ce5fcbc5359fe1c25e21d5f1418b9440844239cd66dc9f5392b0fb8526"} Apr 16 04:26:31.162856 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.162830 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" Apr 16 04:26:31.179195 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.179141 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" podStartSLOduration=55.663506611 podStartE2EDuration="58.179127093s" podCreationTimestamp="2026-04-16 04:25:33 +0000 UTC" firstStartedPulling="2026-04-16 04:25:34.390941601 +0000 UTC m=+82.293717582" lastFinishedPulling="2026-04-16 04:25:36.906562072 +0000 UTC m=+84.809338064" observedRunningTime="2026-04-16 04:26:31.177688612 +0000 UTC m=+139.080464628" watchObservedRunningTime="2026-04-16 04:26:31.179127093 +0000 UTC m=+139.081903091" Apr 16 04:26:31.554959 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.554932 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-kn57s" Apr 16 04:26:31.720546 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.720512 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-q9zfd"] Apr 16 04:26:31.720895 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.720802 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d8b1eea-523b-4786-aa84-e8f73ea58da2" containerName="registry" Apr 16 04:26:31.720895 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.720813 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8b1eea-523b-4786-aa84-e8f73ea58da2" containerName="registry" Apr 16 04:26:31.720895 ip-10-0-133-103 
kubenswrapper[2575]: I0416 04:26:31.720862 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d8b1eea-523b-4786-aa84-e8f73ea58da2" containerName="registry" Apr 16 04:26:31.723693 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.723677 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-q9zfd" Apr 16 04:26:31.726420 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.726398 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-d2qmb\"" Apr 16 04:26:31.726543 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.726398 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 04:26:31.726855 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.726832 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 04:26:31.731049 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.731014 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-q9zfd"] Apr 16 04:26:31.823852 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.823758 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x22c\" (UniqueName: \"kubernetes.io/projected/fab06a06-1137-449c-860c-0eb917231cd3-kube-api-access-6x22c\") pod \"downloads-586b57c7b4-q9zfd\" (UID: \"fab06a06-1137-449c-860c-0eb917231cd3\") " pod="openshift-console/downloads-586b57c7b4-q9zfd" Apr 16 04:26:31.924935 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.924899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6x22c\" (UniqueName: \"kubernetes.io/projected/fab06a06-1137-449c-860c-0eb917231cd3-kube-api-access-6x22c\") pod \"downloads-586b57c7b4-q9zfd\" (UID: 
\"fab06a06-1137-449c-860c-0eb917231cd3\") " pod="openshift-console/downloads-586b57c7b4-q9zfd" Apr 16 04:26:31.933160 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:31.933127 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x22c\" (UniqueName: \"kubernetes.io/projected/fab06a06-1137-449c-860c-0eb917231cd3-kube-api-access-6x22c\") pod \"downloads-586b57c7b4-q9zfd\" (UID: \"fab06a06-1137-449c-860c-0eb917231cd3\") " pod="openshift-console/downloads-586b57c7b4-q9zfd" Apr 16 04:26:32.034109 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:32.034067 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-q9zfd" Apr 16 04:26:32.168297 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:32.168274 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-q9zfd"] Apr 16 04:26:32.170456 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:26:32.170425 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfab06a06_1137_449c_860c_0eb917231cd3.slice/crio-8df4313769bebc9f374afdaccd85894a6f91e08653f235e915f3340f68632c0f WatchSource:0}: Error finding container 8df4313769bebc9f374afdaccd85894a6f91e08653f235e915f3340f68632c0f: Status 404 returned error can't find the container with id 8df4313769bebc9f374afdaccd85894a6f91e08653f235e915f3340f68632c0f Apr 16 04:26:33.159973 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:33.159945 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-84c6dc88d4-45qc8" Apr 16 04:26:33.170076 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:33.170043 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-q9zfd" 
event={"ID":"fab06a06-1137-449c-860c-0eb917231cd3","Type":"ContainerStarted","Data":"8df4313769bebc9f374afdaccd85894a6f91e08653f235e915f3340f68632c0f"} Apr 16 04:26:42.226891 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.226855 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64d4f5b557-ngpk2"] Apr 16 04:26:42.231725 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.231696 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.234578 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.234550 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 04:26:42.234702 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.234554 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 04:26:42.235758 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.235732 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 04:26:42.235882 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.235780 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 04:26:42.236358 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.236113 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 04:26:42.236358 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.236125 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-k9bfv\"" Apr 16 04:26:42.238606 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.238583 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d4f5b557-ngpk2"] Apr 16 04:26:42.307461 
ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.307408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-serving-cert\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.307627 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.307468 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-service-ca\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.307627 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.307597 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-config\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.307733 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.307641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-oauth-config\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.307788 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.307728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc88t\" (UniqueName: 
\"kubernetes.io/projected/056a5c11-8892-4001-9ab3-ed3bc1c17305-kube-api-access-lc88t\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.307788 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.307771 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-oauth-serving-cert\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.408755 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.408709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-config\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.408949 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.408780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-oauth-config\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.408949 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.408814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc88t\" (UniqueName: \"kubernetes.io/projected/056a5c11-8892-4001-9ab3-ed3bc1c17305-kube-api-access-lc88t\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.408949 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.408841 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-oauth-serving-cert\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.408949 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.408870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-serving-cert\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.408949 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.408892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-service-ca\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.409629 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.409567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-config\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.409757 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.409636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-service-ca\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.410533 ip-10-0-133-103 kubenswrapper[2575]: 
I0416 04:26:42.410509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-oauth-serving-cert\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.411715 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.411691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-oauth-config\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.411812 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.411729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-serving-cert\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.417718 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.417694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc88t\" (UniqueName: \"kubernetes.io/projected/056a5c11-8892-4001-9ab3-ed3bc1c17305-kube-api-access-lc88t\") pod \"console-64d4f5b557-ngpk2\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") " pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:42.544603 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:42.544566 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64d4f5b557-ngpk2" Apr 16 04:26:49.130854 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:49.130758 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d4f5b557-ngpk2"] Apr 16 04:26:49.143501 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:26:49.143469 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod056a5c11_8892_4001_9ab3_ed3bc1c17305.slice/crio-4fbef342508f26fe2fdbb2d02722e2579818b00772d51bc1175d04d47ab2a035 WatchSource:0}: Error finding container 4fbef342508f26fe2fdbb2d02722e2579818b00772d51bc1175d04d47ab2a035: Status 404 returned error can't find the container with id 4fbef342508f26fe2fdbb2d02722e2579818b00772d51bc1175d04d47ab2a035 Apr 16 04:26:49.219411 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:49.219379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-q9zfd" event={"ID":"fab06a06-1137-449c-860c-0eb917231cd3","Type":"ContainerStarted","Data":"09459c76349b2a0909effda727982c0496c1bf81ca6da92e5a9235499f378a09"} Apr 16 04:26:49.219606 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:49.219585 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-q9zfd" Apr 16 04:26:49.220767 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:49.220742 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d4f5b557-ngpk2" event={"ID":"056a5c11-8892-4001-9ab3-ed3bc1c17305","Type":"ContainerStarted","Data":"4fbef342508f26fe2fdbb2d02722e2579818b00772d51bc1175d04d47ab2a035"} Apr 16 04:26:49.221208 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:49.221189 2575 patch_prober.go:28] interesting pod/downloads-586b57c7b4-q9zfd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.17:8080/\": dial tcp 
10.134.0.17:8080: connect: connection refused" start-of-body= Apr 16 04:26:49.221270 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:49.221239 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-586b57c7b4-q9zfd" podUID="fab06a06-1137-449c-860c-0eb917231cd3" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.17:8080/\": dial tcp 10.134.0.17:8080: connect: connection refused" Apr 16 04:26:49.235783 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:49.235719 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-q9zfd" podStartSLOduration=1.303663108 podStartE2EDuration="18.235703344s" podCreationTimestamp="2026-04-16 04:26:31 +0000 UTC" firstStartedPulling="2026-04-16 04:26:32.172361926 +0000 UTC m=+140.075137903" lastFinishedPulling="2026-04-16 04:26:49.104402145 +0000 UTC m=+157.007178139" observedRunningTime="2026-04-16 04:26:49.234271071 +0000 UTC m=+157.137047069" watchObservedRunningTime="2026-04-16 04:26:49.235703344 +0000 UTC m=+157.138479342" Apr 16 04:26:50.239345 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.239303 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-q9zfd" Apr 16 04:26:50.443011 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.442956 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d95bc46c9-rpks7"] Apr 16 04:26:50.446667 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.446641 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.454446 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.454317 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 04:26:50.458546 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.458502 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d95bc46c9-rpks7"] Apr 16 04:26:50.485295 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.485256 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-console-config\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.485295 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.485304 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxrf\" (UniqueName: \"kubernetes.io/projected/b4643967-3602-4e57-bac6-5500591809f5-kube-api-access-4dxrf\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.485628 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.485364 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-service-ca\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.485628 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.485395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-trusted-ca-bundle\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.485628 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.485424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4643967-3602-4e57-bac6-5500591809f5-console-oauth-config\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.485628 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.485468 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4643967-3602-4e57-bac6-5500591809f5-console-serving-cert\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.485628 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.485514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-oauth-serving-cert\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.586665 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.586621 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4643967-3602-4e57-bac6-5500591809f5-console-serving-cert\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.586861 ip-10-0-133-103 kubenswrapper[2575]: I0416 
04:26:50.586692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-oauth-serving-cert\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.586861 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.586779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-console-config\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.586861 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.586806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxrf\" (UniqueName: \"kubernetes.io/projected/b4643967-3602-4e57-bac6-5500591809f5-kube-api-access-4dxrf\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.586861 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.586838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-service-ca\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.587088 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.586864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-trusted-ca-bundle\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 
04:26:50.587088 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.586894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4643967-3602-4e57-bac6-5500591809f5-console-oauth-config\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.588144 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.588054 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-trusted-ca-bundle\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.588144 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.588065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-oauth-serving-cert\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.588144 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.588139 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-service-ca\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.588594 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.588561 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-console-config\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " 
pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.590715 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.590694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4643967-3602-4e57-bac6-5500591809f5-console-oauth-config\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.591468 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.591443 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4643967-3602-4e57-bac6-5500591809f5-console-serving-cert\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.596049 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.596028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxrf\" (UniqueName: \"kubernetes.io/projected/b4643967-3602-4e57-bac6-5500591809f5-kube-api-access-4dxrf\") pod \"console-d95bc46c9-rpks7\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") " pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.761911 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.761871 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d95bc46c9-rpks7" Apr 16 04:26:50.913856 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:50.913821 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d95bc46c9-rpks7"] Apr 16 04:26:50.916360 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:26:50.916298 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4643967_3602_4e57_bac6_5500591809f5.slice/crio-c0a9d33a4b98b019944ac1144288ea9cfdd278c00d0f17fbc29edab3d1151a3b WatchSource:0}: Error finding container c0a9d33a4b98b019944ac1144288ea9cfdd278c00d0f17fbc29edab3d1151a3b: Status 404 returned error can't find the container with id c0a9d33a4b98b019944ac1144288ea9cfdd278c00d0f17fbc29edab3d1151a3b Apr 16 04:26:51.230146 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:51.229978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d95bc46c9-rpks7" event={"ID":"b4643967-3602-4e57-bac6-5500591809f5","Type":"ContainerStarted","Data":"c0a9d33a4b98b019944ac1144288ea9cfdd278c00d0f17fbc29edab3d1151a3b"} Apr 16 04:26:53.239289 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:53.238747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d4f5b557-ngpk2" event={"ID":"056a5c11-8892-4001-9ab3-ed3bc1c17305","Type":"ContainerStarted","Data":"0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7"} Apr 16 04:26:53.240381 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:53.240352 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d95bc46c9-rpks7" event={"ID":"b4643967-3602-4e57-bac6-5500591809f5","Type":"ContainerStarted","Data":"b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e"} Apr 16 04:26:53.255701 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:53.255653 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-64d4f5b557-ngpk2" podStartSLOduration=7.482043905 podStartE2EDuration="11.255638217s" podCreationTimestamp="2026-04-16 04:26:42 +0000 UTC" firstStartedPulling="2026-04-16 04:26:49.145585298 +0000 UTC m=+157.048361275" lastFinishedPulling="2026-04-16 04:26:52.919179607 +0000 UTC m=+160.821955587" observedRunningTime="2026-04-16 04:26:53.254714033 +0000 UTC m=+161.157490035" watchObservedRunningTime="2026-04-16 04:26:53.255638217 +0000 UTC m=+161.158414215" Apr 16 04:26:53.271154 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:53.271096 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d95bc46c9-rpks7" podStartSLOduration=1.269413002 podStartE2EDuration="3.271076311s" podCreationTimestamp="2026-04-16 04:26:50 +0000 UTC" firstStartedPulling="2026-04-16 04:26:50.918656163 +0000 UTC m=+158.821432148" lastFinishedPulling="2026-04-16 04:26:52.920319466 +0000 UTC m=+160.823095457" observedRunningTime="2026-04-16 04:26:53.269989124 +0000 UTC m=+161.172765152" watchObservedRunningTime="2026-04-16 04:26:53.271076311 +0000 UTC m=+161.173852311" Apr 16 04:26:58.258913 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:58.258872 2575 generic.go:358] "Generic (PLEG): container finished" podID="b6856769-4d20-4817-9f00-63ded00b2073" containerID="c7fbd8c27966bbde3275dbe01b1463c35f5314a016ad4b42ad87bacd52c1ec46" exitCode=0 Apr 16 04:26:58.259413 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:58.258956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7" event={"ID":"b6856769-4d20-4817-9f00-63ded00b2073","Type":"ContainerDied","Data":"c7fbd8c27966bbde3275dbe01b1463c35f5314a016ad4b42ad87bacd52c1ec46"} Apr 16 04:26:58.259413 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:58.259386 2575 scope.go:117] "RemoveContainer" 
containerID="c7fbd8c27966bbde3275dbe01b1463c35f5314a016ad4b42ad87bacd52c1ec46"
Apr 16 04:26:59.263215 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:26:59.263179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-79vz7" event={"ID":"b6856769-4d20-4817-9f00-63ded00b2073","Type":"ContainerStarted","Data":"c0c0bca09b86db705a7e022bbcd10ecddbaf6bbc83ca33bfe356537ab3534365"}
Apr 16 04:27:00.762506 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:00.762469 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d95bc46c9-rpks7"
Apr 16 04:27:00.762973 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:00.762517 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d95bc46c9-rpks7"
Apr 16 04:27:00.767385 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:00.767358 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d95bc46c9-rpks7"
Apr 16 04:27:01.272215 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:01.272182 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d95bc46c9-rpks7"
Apr 16 04:27:01.315577 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:01.315546 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64d4f5b557-ngpk2"]
Apr 16 04:27:02.545323 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:02.545286 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64d4f5b557-ngpk2"
Apr 16 04:27:26.338586 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.338542 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-64d4f5b557-ngpk2" podUID="056a5c11-8892-4001-9ab3-ed3bc1c17305" containerName="console" containerID="cri-o://0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7" gracePeriod=15
Apr 16 04:27:26.593436 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.593374 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d4f5b557-ngpk2_056a5c11-8892-4001-9ab3-ed3bc1c17305/console/0.log"
Apr 16 04:27:26.593545 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.593439 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d4f5b557-ngpk2"
Apr 16 04:27:26.686407 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.686372 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-config\") pod \"056a5c11-8892-4001-9ab3-ed3bc1c17305\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") "
Apr 16 04:27:26.686596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.686426 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-oauth-config\") pod \"056a5c11-8892-4001-9ab3-ed3bc1c17305\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") "
Apr 16 04:27:26.686596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.686496 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-serving-cert\") pod \"056a5c11-8892-4001-9ab3-ed3bc1c17305\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") "
Apr 16 04:27:26.686596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.686519 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc88t\" (UniqueName: \"kubernetes.io/projected/056a5c11-8892-4001-9ab3-ed3bc1c17305-kube-api-access-lc88t\") pod \"056a5c11-8892-4001-9ab3-ed3bc1c17305\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") "
Apr 16 04:27:26.686596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.686550 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-oauth-serving-cert\") pod \"056a5c11-8892-4001-9ab3-ed3bc1c17305\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") "
Apr 16 04:27:26.686596 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.686586 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-service-ca\") pod \"056a5c11-8892-4001-9ab3-ed3bc1c17305\" (UID: \"056a5c11-8892-4001-9ab3-ed3bc1c17305\") "
Apr 16 04:27:26.686909 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.686791 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-config" (OuterVolumeSpecName: "console-config") pod "056a5c11-8892-4001-9ab3-ed3bc1c17305" (UID: "056a5c11-8892-4001-9ab3-ed3bc1c17305"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:27:26.687013 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.686962 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "056a5c11-8892-4001-9ab3-ed3bc1c17305" (UID: "056a5c11-8892-4001-9ab3-ed3bc1c17305"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:27:26.687058 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.687012 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-service-ca" (OuterVolumeSpecName: "service-ca") pod "056a5c11-8892-4001-9ab3-ed3bc1c17305" (UID: "056a5c11-8892-4001-9ab3-ed3bc1c17305"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:27:26.688867 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.688836 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "056a5c11-8892-4001-9ab3-ed3bc1c17305" (UID: "056a5c11-8892-4001-9ab3-ed3bc1c17305"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:27:26.688965 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.688875 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "056a5c11-8892-4001-9ab3-ed3bc1c17305" (UID: "056a5c11-8892-4001-9ab3-ed3bc1c17305"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:27:26.688965 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.688928 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/056a5c11-8892-4001-9ab3-ed3bc1c17305-kube-api-access-lc88t" (OuterVolumeSpecName: "kube-api-access-lc88t") pod "056a5c11-8892-4001-9ab3-ed3bc1c17305" (UID: "056a5c11-8892-4001-9ab3-ed3bc1c17305"). InnerVolumeSpecName "kube-api-access-lc88t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:27:26.788042 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.787645 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-serving-cert\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:27:26.788237 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.788135 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lc88t\" (UniqueName: \"kubernetes.io/projected/056a5c11-8892-4001-9ab3-ed3bc1c17305-kube-api-access-lc88t\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:27:26.788297 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.788241 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-oauth-serving-cert\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:27:26.788404 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.788299 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-service-ca\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:27:26.792765 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.792738 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-config\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:27:26.793098 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:26.793078 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/056a5c11-8892-4001-9ab3-ed3bc1c17305-console-oauth-config\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:27:27.359173 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:27.359144 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d4f5b557-ngpk2_056a5c11-8892-4001-9ab3-ed3bc1c17305/console/0.log"
Apr 16 04:27:27.359638 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:27.359187 2575 generic.go:358] "Generic (PLEG): container finished" podID="056a5c11-8892-4001-9ab3-ed3bc1c17305" containerID="0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7" exitCode=2
Apr 16 04:27:27.359638 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:27.359225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d4f5b557-ngpk2" event={"ID":"056a5c11-8892-4001-9ab3-ed3bc1c17305","Type":"ContainerDied","Data":"0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7"}
Apr 16 04:27:27.359638 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:27.359268 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d4f5b557-ngpk2" event={"ID":"056a5c11-8892-4001-9ab3-ed3bc1c17305","Type":"ContainerDied","Data":"4fbef342508f26fe2fdbb2d02722e2579818b00772d51bc1175d04d47ab2a035"}
Apr 16 04:27:27.359638 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:27.359279 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d4f5b557-ngpk2"
Apr 16 04:27:27.359638 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:27.359286 2575 scope.go:117] "RemoveContainer" containerID="0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7"
Apr 16 04:27:27.366935 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:27.366920 2575 scope.go:117] "RemoveContainer" containerID="0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7"
Apr 16 04:27:27.367175 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:27:27.367156 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7\": container with ID starting with 0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7 not found: ID does not exist" containerID="0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7"
Apr 16 04:27:27.367223 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:27.367184 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7"} err="failed to get container status \"0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7\": rpc error: code = NotFound desc = could not find container \"0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7\": container with ID starting with 0577bb20bbcc156a22319d8088e1d22a55c068b327e70c4c41297acb21972cb7 not found: ID does not exist"
Apr 16 04:27:27.375091 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:27.375064 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64d4f5b557-ngpk2"]
Apr 16 04:27:27.378966 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:27.378946 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64d4f5b557-ngpk2"]
Apr 16 04:27:28.706935 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:28.706904 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="056a5c11-8892-4001-9ab3-ed3bc1c17305" path="/var/lib/kubelet/pods/056a5c11-8892-4001-9ab3-ed3bc1c17305/volumes"
Apr 16 04:27:40.071967 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.071921 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 04:27:40.072447 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.072353 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="alertmanager" containerID="cri-o://c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49" gracePeriod=120
Apr 16 04:27:40.072447 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.072393 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="kube-rbac-proxy" containerID="cri-o://b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e" gracePeriod=120
Apr 16 04:27:40.072447 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.072412 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="kube-rbac-proxy-metric" containerID="cri-o://87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f" gracePeriod=120
Apr 16 04:27:40.072662 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.072433 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="kube-rbac-proxy-web" containerID="cri-o://7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb" gracePeriod=120
Apr 16 04:27:40.072662 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.072451 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="config-reloader" containerID="cri-o://17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec" gracePeriod=120
Apr 16 04:27:40.072662 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.072387 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="prom-label-proxy" containerID="cri-o://b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6" gracePeriod=120
Apr 16 04:27:40.401264 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.401176 2575 generic.go:358] "Generic (PLEG): container finished" podID="04fc0933-f968-413e-a220-0cef26f8f792" containerID="b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6" exitCode=0
Apr 16 04:27:40.401264 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.401199 2575 generic.go:358] "Generic (PLEG): container finished" podID="04fc0933-f968-413e-a220-0cef26f8f792" containerID="b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e" exitCode=0
Apr 16 04:27:40.401264 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.401206 2575 generic.go:358] "Generic (PLEG): container finished" podID="04fc0933-f968-413e-a220-0cef26f8f792" containerID="17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec" exitCode=0
Apr 16 04:27:40.401264 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.401211 2575 generic.go:358] "Generic (PLEG): container finished" podID="04fc0933-f968-413e-a220-0cef26f8f792" containerID="c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49" exitCode=0
Apr 16 04:27:40.401264 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.401250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerDied","Data":"b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6"}
Apr 16 04:27:40.401566 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.401286 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerDied","Data":"b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e"}
Apr 16 04:27:40.401566 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.401296 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerDied","Data":"17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec"}
Apr 16 04:27:40.401566 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:40.401304 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerDied","Data":"c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49"}
Apr 16 04:27:41.320720 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.320698 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:27:41.402560 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.402473 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04fc0933-f968-413e-a220-0cef26f8f792-alertmanager-trusted-ca-bundle\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.402560 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.402518 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.402560 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.402548 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy-metric\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.402833 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.402576 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/04fc0933-f968-413e-a220-0cef26f8f792-alertmanager-main-db\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.402833 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.402593 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-config-volume\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.402833 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.402653 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/04fc0933-f968-413e-a220-0cef26f8f792-tls-assets\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.402833 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.402726 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-main-tls\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.403007 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.402888 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04fc0933-f968-413e-a220-0cef26f8f792-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:27:41.403007 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.402932 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04fc0933-f968-413e-a220-0cef26f8f792-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:27:41.403118 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.403018 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-web-config\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.403118 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.403052 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-cluster-tls-config\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.403118 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.403085 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/04fc0933-f968-413e-a220-0cef26f8f792-config-out\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.403285 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.403128 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy-web\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.403285 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.403159 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmjvx\" (UniqueName: \"kubernetes.io/projected/04fc0933-f968-413e-a220-0cef26f8f792-kube-api-access-cmjvx\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.403285 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.403211 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04fc0933-f968-413e-a220-0cef26f8f792-metrics-client-ca\") pod \"04fc0933-f968-413e-a220-0cef26f8f792\" (UID: \"04fc0933-f968-413e-a220-0cef26f8f792\") "
Apr 16 04:27:41.405146 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.403487 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/04fc0933-f968-413e-a220-0cef26f8f792-alertmanager-main-db\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:27:41.405146 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.403507 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04fc0933-f968-413e-a220-0cef26f8f792-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:27:41.405146 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.403881 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04fc0933-f968-413e-a220-0cef26f8f792-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:27:41.406696 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.405754 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-config-volume" (OuterVolumeSpecName: "config-volume") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:27:41.406696 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.405769 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04fc0933-f968-413e-a220-0cef26f8f792-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:27:41.406696 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.406192 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:27:41.406696 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.406632 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:27:41.406696 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.406640 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04fc0933-f968-413e-a220-0cef26f8f792-kube-api-access-cmjvx" (OuterVolumeSpecName: "kube-api-access-cmjvx") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "kube-api-access-cmjvx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:27:41.407456 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.407418 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:27:41.407724 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.407685 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:27:41.407826 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.407773 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04fc0933-f968-413e-a220-0cef26f8f792-config-out" (OuterVolumeSpecName: "config-out") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:27:41.409272 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.409245 2575 generic.go:358] "Generic (PLEG): container finished" podID="04fc0933-f968-413e-a220-0cef26f8f792" containerID="87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f" exitCode=0
Apr 16 04:27:41.409272 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.409271 2575 generic.go:358] "Generic (PLEG): container finished" podID="04fc0933-f968-413e-a220-0cef26f8f792" containerID="7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb" exitCode=0
Apr 16 04:27:41.409479 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.409389 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerDied","Data":"87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f"}
Apr 16 04:27:41.409479 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.409434 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 04:27:41.409479 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.409448 2575 scope.go:117] "RemoveContainer" containerID="b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6"
Apr 16 04:27:41.409690 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.409434 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerDied","Data":"7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb"}
Apr 16 04:27:41.409690 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.409563 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"04fc0933-f968-413e-a220-0cef26f8f792","Type":"ContainerDied","Data":"81baf301a80bad968ef29adbbf203472fda793b201f3884ed81642cbd9354297"}
Apr 16 04:27:41.411782 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.411759 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:27:41.418487 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.418464 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-web-config" (OuterVolumeSpecName: "web-config") pod "04fc0933-f968-413e-a220-0cef26f8f792" (UID: "04fc0933-f968-413e-a220-0cef26f8f792"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:27:41.419869 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.419850 2575 scope.go:117] "RemoveContainer" containerID="87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f"
Apr 16 04:27:41.426474 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.426452 2575 scope.go:117] "RemoveContainer" containerID="b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e"
Apr 16 04:27:41.432906 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.432890 2575 scope.go:117] "RemoveContainer" containerID="7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb"
Apr 16 04:27:41.439138 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.439119 2575 scope.go:117] "RemoveContainer" containerID="17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec"
Apr 16 04:27:41.445476 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.445454 2575 scope.go:117] "RemoveContainer" containerID="c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49"
Apr 16 04:27:41.451910 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.451892 2575 scope.go:117] "RemoveContainer" containerID="9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610"
Apr 16 04:27:41.458226 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.458207 2575 scope.go:117] "RemoveContainer" containerID="b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6"
Apr 16 04:27:41.458503 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:27:41.458484 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6\": container with ID starting with b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6 not found: ID does not exist" containerID="b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6"
Apr 16 04:27:41.458562 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.458514 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6"} err="failed to get container status \"b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6\": rpc error: code = NotFound desc = could not find container \"b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6\": container with ID starting with b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6 not found: ID does not exist"
Apr 16 04:27:41.458562 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.458538 2575 scope.go:117] "RemoveContainer" containerID="87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f"
Apr 16 04:27:41.458798 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:27:41.458778 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f\": container with ID starting with 87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f not found: ID does not exist" containerID="87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f"
Apr 16 04:27:41.458850 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.458806 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f"} err="failed to get container status \"87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f\": rpc error: code = NotFound desc = could not find container \"87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f\": container with ID starting with 87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f not found: ID does not exist"
Apr 16 04:27:41.458850 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.458823 2575 scope.go:117] "RemoveContainer" containerID="b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e"
Apr 16 04:27:41.459049 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:27:41.459029 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e\": container with ID starting with b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e not found: ID does not exist" containerID="b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e"
Apr 16 04:27:41.459092 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.459053 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e"} err="failed to get container status \"b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e\": rpc error: code = NotFound desc = could not find container \"b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e\": container with ID starting with b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e not found: ID does not exist"
Apr 16 04:27:41.459092 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.459068 2575 scope.go:117] "RemoveContainer" containerID="7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb"
Apr 16 04:27:41.459280 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:27:41.459259 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb\": container with ID starting with 7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb not found: ID does not exist" containerID="7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb"
Apr 16 04:27:41.459349 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.459283 2575 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb"} err="failed to get container status \"7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb\": rpc error: code = NotFound desc = could not find container \"7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb\": container with ID starting with 7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb not found: ID does not exist" Apr 16 04:27:41.459349 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.459295 2575 scope.go:117] "RemoveContainer" containerID="17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec" Apr 16 04:27:41.459503 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:27:41.459487 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec\": container with ID starting with 17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec not found: ID does not exist" containerID="17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec" Apr 16 04:27:41.459542 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.459506 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec"} err="failed to get container status \"17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec\": rpc error: code = NotFound desc = could not find container \"17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec\": container with ID starting with 17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec not found: ID does not exist" Apr 16 04:27:41.459542 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.459520 2575 scope.go:117] "RemoveContainer" containerID="c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49" Apr 16 04:27:41.459722 
ip-10-0-133-103 kubenswrapper[2575]: E0416 04:27:41.459706 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49\": container with ID starting with c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49 not found: ID does not exist" containerID="c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49" Apr 16 04:27:41.459761 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.459727 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49"} err="failed to get container status \"c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49\": rpc error: code = NotFound desc = could not find container \"c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49\": container with ID starting with c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49 not found: ID does not exist" Apr 16 04:27:41.459761 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.459745 2575 scope.go:117] "RemoveContainer" containerID="9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610" Apr 16 04:27:41.459952 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:27:41.459927 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610\": container with ID starting with 9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610 not found: ID does not exist" containerID="9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610" Apr 16 04:27:41.460007 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.459954 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610"} err="failed to get container status \"9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610\": rpc error: code = NotFound desc = could not find container \"9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610\": container with ID starting with 9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610 not found: ID does not exist" Apr 16 04:27:41.460007 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.459966 2575 scope.go:117] "RemoveContainer" containerID="b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6" Apr 16 04:27:41.460146 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.460132 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6"} err="failed to get container status \"b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6\": rpc error: code = NotFound desc = could not find container \"b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6\": container with ID starting with b0a65288e643ed0f93c109311470663c9c98aa967b54eb49a4880c54f7119ea6 not found: ID does not exist" Apr 16 04:27:41.460189 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.460145 2575 scope.go:117] "RemoveContainer" containerID="87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f" Apr 16 04:27:41.460358 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.460321 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f"} err="failed to get container status \"87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f\": rpc error: code = NotFound desc = could not find container \"87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f\": container with ID starting with 
87e3bb9feb57b4c6d5afe470ed4f017d9e86f91fc61a24061b202961e547864f not found: ID does not exist" Apr 16 04:27:41.460358 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.460357 2575 scope.go:117] "RemoveContainer" containerID="b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e" Apr 16 04:27:41.460578 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.460560 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e"} err="failed to get container status \"b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e\": rpc error: code = NotFound desc = could not find container \"b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e\": container with ID starting with b3ee7bd95b6870061c2ab459cfde8d7f139b8da9c1f44c30a732e2c72f04727e not found: ID does not exist" Apr 16 04:27:41.460616 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.460578 2575 scope.go:117] "RemoveContainer" containerID="7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb" Apr 16 04:27:41.460799 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.460781 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb"} err="failed to get container status \"7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb\": rpc error: code = NotFound desc = could not find container \"7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb\": container with ID starting with 7fb8dd22d8d0ea66b6075b542b76223af53e1dffb8879e1b089c1dd217b08ecb not found: ID does not exist" Apr 16 04:27:41.460799 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.460799 2575 scope.go:117] "RemoveContainer" containerID="17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec" Apr 16 04:27:41.460995 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.460980 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec"} err="failed to get container status \"17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec\": rpc error: code = NotFound desc = could not find container \"17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec\": container with ID starting with 17b7ac979ac6a67fb3ad96fa61b71c8ee01f408a7b93c570e3b7e1e8621fc7ec not found: ID does not exist" Apr 16 04:27:41.461035 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.460995 2575 scope.go:117] "RemoveContainer" containerID="c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49" Apr 16 04:27:41.461196 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.461180 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49"} err="failed to get container status \"c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49\": rpc error: code = NotFound desc = could not find container \"c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49\": container with ID starting with c65b14134904a9ccd53dc817d15066c4f8309baffb19c39287d275208a00fa49 not found: ID does not exist" Apr 16 04:27:41.461242 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.461195 2575 scope.go:117] "RemoveContainer" containerID="9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610" Apr 16 04:27:41.461430 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.461414 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610"} err="failed to get container status \"9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610\": rpc error: code = NotFound desc = could not find container 
\"9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610\": container with ID starting with 9e1b665ba78eded916841fb126b61d6e020f976d1511ceda99f09117f3945610 not found: ID does not exist" Apr 16 04:27:41.504679 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.504648 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/04fc0933-f968-413e-a220-0cef26f8f792-tls-assets\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:27:41.504679 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.504678 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-main-tls\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:27:41.504876 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.504690 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-web-config\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:27:41.504876 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.504701 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-cluster-tls-config\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:27:41.504876 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.504710 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/04fc0933-f968-413e-a220-0cef26f8f792-config-out\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:27:41.504876 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.504720 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:27:41.504876 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.504728 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cmjvx\" (UniqueName: \"kubernetes.io/projected/04fc0933-f968-413e-a220-0cef26f8f792-kube-api-access-cmjvx\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:27:41.504876 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.504737 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04fc0933-f968-413e-a220-0cef26f8f792-metrics-client-ca\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:27:41.504876 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.504746 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:27:41.504876 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.504755 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:27:41.504876 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.504764 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/04fc0933-f968-413e-a220-0cef26f8f792-config-volume\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:27:41.738794 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.738750 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 04:27:41.741906 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.741874 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 04:27:41.764724 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.764692 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 04:27:41.764993 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.764982 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="kube-rbac-proxy" Apr 16 04:27:41.765050 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.764995 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="kube-rbac-proxy" Apr 16 04:27:41.765050 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765005 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="prom-label-proxy" Apr 16 04:27:41.765050 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765010 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="prom-label-proxy" Apr 16 04:27:41.765050 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765018 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="kube-rbac-proxy-metric" Apr 16 04:27:41.765050 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765024 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="kube-rbac-proxy-metric" Apr 16 04:27:41.765050 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765029 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="056a5c11-8892-4001-9ab3-ed3bc1c17305" 
containerName="console" Apr 16 04:27:41.765050 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765034 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="056a5c11-8892-4001-9ab3-ed3bc1c17305" containerName="console" Apr 16 04:27:41.765050 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765047 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="alertmanager" Apr 16 04:27:41.765050 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765052 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="alertmanager" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765059 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="config-reloader" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765064 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="config-reloader" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765070 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="kube-rbac-proxy-web" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765076 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="kube-rbac-proxy-web" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765084 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="init-config-reloader" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765090 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fc0933-f968-413e-a220-0cef26f8f792" 
containerName="init-config-reloader" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765131 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="prom-label-proxy" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765140 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="kube-rbac-proxy-web" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765148 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="config-reloader" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765153 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="056a5c11-8892-4001-9ab3-ed3bc1c17305" containerName="console" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765160 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="alertmanager" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765166 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="kube-rbac-proxy" Apr 16 04:27:41.765314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.765172 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="04fc0933-f968-413e-a220-0cef26f8f792" containerName="kube-rbac-proxy-metric" Apr 16 04:27:41.768422 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.768407 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.771227 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.771198 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 04:27:41.771227 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.771206 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-j6ksw\"" Apr 16 04:27:41.771445 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.771277 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 04:27:41.771506 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.771483 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 04:27:41.771575 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.771554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 04:27:41.771635 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.771619 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 04:27:41.771692 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.771675 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 04:27:41.771793 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.771763 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 04:27:41.772030 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.771863 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 04:27:41.776220 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.776201 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 04:27:41.781113 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.781092 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 04:27:41.806048 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-web-config\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.806048 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806056 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11531649-599b-411c-a280-3750a982723c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.806285 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806081 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.806285 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806104 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/11531649-599b-411c-a280-3750a982723c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.806285 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.806285 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.806285 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/11531649-599b-411c-a280-3750a982723c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.806524 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn7qd\" (UniqueName: \"kubernetes.io/projected/11531649-599b-411c-a280-3750a982723c-kube-api-access-qn7qd\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.806524 
ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-config-volume\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.806524 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806379 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/11531649-599b-411c-a280-3750a982723c-config-out\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.806524 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806435 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.806524 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.806673 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.806530 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/11531649-599b-411c-a280-3750a982723c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.907478 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.907439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11531649-599b-411c-a280-3750a982723c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.907478 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.907480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-web-config\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.907734 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.907499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11531649-599b-411c-a280-3750a982723c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.907734 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.907527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.907734 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.907550 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/11531649-599b-411c-a280-3750a982723c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.907734 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.907577 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.907734 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.907635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.907734 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.907666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/11531649-599b-411c-a280-3750a982723c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.908102 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.908080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qn7qd\" (UniqueName: \"kubernetes.io/projected/11531649-599b-411c-a280-3750a982723c-kube-api-access-qn7qd\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
04:27:41.908262 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.908246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-config-volume\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.908427 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.908405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11531649-599b-411c-a280-3750a982723c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.908521 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.908504 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/11531649-599b-411c-a280-3750a982723c-config-out\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.908661 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.908646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.908748 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.908729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: 
\"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.908830 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.907941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/11531649-599b-411c-a280-3750a982723c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.908830 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.908524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11531649-599b-411c-a280-3750a982723c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.910941 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.910618 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/11531649-599b-411c-a280-3750a982723c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.910941 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.910676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.910941 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.910881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.911109 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.910959 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-web-config\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.911109 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.911005 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.911109 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.911047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-config-volume\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.911208 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.911110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/11531649-599b-411c-a280-3750a982723c-config-out\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.911435 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.911419 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.912497 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.912480 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/11531649-599b-411c-a280-3750a982723c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:41.916394 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:41.916352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn7qd\" (UniqueName: \"kubernetes.io/projected/11531649-599b-411c-a280-3750a982723c-kube-api-access-qn7qd\") pod \"alertmanager-main-0\" (UID: \"11531649-599b-411c-a280-3750a982723c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:42.078351 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:42.078291 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 04:27:42.203563 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:42.203536 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 04:27:42.206297 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:27:42.206272 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11531649_599b_411c_a280_3750a982723c.slice/crio-3e18138ce1d61a93bb48925a1281a54512758900b11e424bde01df814d689d9d WatchSource:0}: Error finding container 3e18138ce1d61a93bb48925a1281a54512758900b11e424bde01df814d689d9d: Status 404 returned error can't find the container with id 3e18138ce1d61a93bb48925a1281a54512758900b11e424bde01df814d689d9d Apr 16 04:27:42.414161 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:42.414067 2575 generic.go:358] "Generic (PLEG): container finished" podID="11531649-599b-411c-a280-3750a982723c" containerID="44371499fa55758745999e2aaf45d85aa9658a09e63aac0bcc5218307e45b3e7" exitCode=0 Apr 16 04:27:42.414629 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:42.414154 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"11531649-599b-411c-a280-3750a982723c","Type":"ContainerDied","Data":"44371499fa55758745999e2aaf45d85aa9658a09e63aac0bcc5218307e45b3e7"} Apr 16 04:27:42.414629 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:42.414191 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"11531649-599b-411c-a280-3750a982723c","Type":"ContainerStarted","Data":"3e18138ce1d61a93bb48925a1281a54512758900b11e424bde01df814d689d9d"} Apr 16 04:27:42.706197 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:42.706170 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04fc0933-f968-413e-a220-0cef26f8f792" 
path="/var/lib/kubelet/pods/04fc0933-f968-413e-a220-0cef26f8f792/volumes" Apr 16 04:27:43.421095 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:43.421007 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"11531649-599b-411c-a280-3750a982723c","Type":"ContainerStarted","Data":"b858d89e10f1bec98378b9e4ae5b2b9b66000fb9a987b668beba742aeefd339b"} Apr 16 04:27:43.421095 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:43.421042 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"11531649-599b-411c-a280-3750a982723c","Type":"ContainerStarted","Data":"71772f8fdf2f9f57900079566effc326c4a85992d64b710b4b99c7c8716fc1c2"} Apr 16 04:27:43.421095 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:43.421053 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"11531649-599b-411c-a280-3750a982723c","Type":"ContainerStarted","Data":"3b8d1e021575059de37dbfdecdfd53c50e56ad00f85a216c8f7401b73a44d835"} Apr 16 04:27:43.421095 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:43.421062 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"11531649-599b-411c-a280-3750a982723c","Type":"ContainerStarted","Data":"3767a488c4c4bf86ab9fda28447e676cbc32fedbc4fef752de1efb3fb00d92a1"} Apr 16 04:27:43.421095 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:43.421070 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"11531649-599b-411c-a280-3750a982723c","Type":"ContainerStarted","Data":"1c6264c513f847db61241ab3372f2f801c302b59aa7d6e578fa715e72840852d"} Apr 16 04:27:43.421095 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:43.421077 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"11531649-599b-411c-a280-3750a982723c","Type":"ContainerStarted","Data":"0a1eb3945bb581e0971dc8c1fd67e6613069ee745b44d349cf8d5a06f907b015"} Apr 16 04:27:43.446664 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:43.446615 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.446599356 podStartE2EDuration="2.446599356s" podCreationTimestamp="2026-04-16 04:27:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:27:43.444858492 +0000 UTC m=+211.347634490" watchObservedRunningTime="2026-04-16 04:27:43.446599356 +0000 UTC m=+211.349375354" Apr 16 04:27:59.026265 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.026227 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-88b6ddcd5-x7jmn"] Apr 16 04:27:59.031014 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.030150 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.041367 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.041317 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-88b6ddcd5-x7jmn"] Apr 16 04:27:59.151134 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.151096 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-service-ca\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.151134 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.151145 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr2zt\" (UniqueName: \"kubernetes.io/projected/ef624b49-759e-4d94-85ea-e26cf039a0fc-kube-api-access-zr2zt\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.151408 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.151174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-serving-cert\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.151408 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.151207 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-trusted-ca-bundle\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 
04:27:59.151408 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.151280 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-oauth-config\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.151408 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.151319 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-config\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.151408 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.151377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-oauth-serving-cert\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.252619 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.252574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-service-ca\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.252619 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.252619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zr2zt\" (UniqueName: \"kubernetes.io/projected/ef624b49-759e-4d94-85ea-e26cf039a0fc-kube-api-access-zr2zt\") pod 
\"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.252829 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.252648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-serving-cert\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.252829 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.252769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-trusted-ca-bundle\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.252949 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.252843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-oauth-config\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.252949 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.252876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-config\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.252949 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.252906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-oauth-serving-cert\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.253539 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.253505 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-service-ca\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.253672 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.253616 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-config\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.253729 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.253692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-trusted-ca-bundle\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.253763 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.253748 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-oauth-serving-cert\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.255244 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.255226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-oauth-config\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.255364 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.255348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-serving-cert\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.259930 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.259908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr2zt\" (UniqueName: \"kubernetes.io/projected/ef624b49-759e-4d94-85ea-e26cf039a0fc-kube-api-access-zr2zt\") pod \"console-88b6ddcd5-x7jmn\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") " pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.343772 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.343670 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:27:59.467092 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.467064 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-88b6ddcd5-x7jmn"] Apr 16 04:27:59.469623 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:27:59.469600 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef624b49_759e_4d94_85ea_e26cf039a0fc.slice/crio-61d8d4659941950e20d5ec218a1c1436724c9a912e046651e5dd0f0af806df52 WatchSource:0}: Error finding container 61d8d4659941950e20d5ec218a1c1436724c9a912e046651e5dd0f0af806df52: Status 404 returned error can't find the container with id 61d8d4659941950e20d5ec218a1c1436724c9a912e046651e5dd0f0af806df52 Apr 16 04:27:59.473740 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:27:59.473715 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-88b6ddcd5-x7jmn" event={"ID":"ef624b49-759e-4d94-85ea-e26cf039a0fc","Type":"ContainerStarted","Data":"61d8d4659941950e20d5ec218a1c1436724c9a912e046651e5dd0f0af806df52"} Apr 16 04:28:00.477508 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:00.477477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-88b6ddcd5-x7jmn" event={"ID":"ef624b49-759e-4d94-85ea-e26cf039a0fc","Type":"ContainerStarted","Data":"f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10"} Apr 16 04:28:00.494140 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:00.494096 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-88b6ddcd5-x7jmn" podStartSLOduration=1.494083232 podStartE2EDuration="1.494083232s" podCreationTimestamp="2026-04-16 04:27:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:28:00.493456098 +0000 UTC m=+228.396232109" 
watchObservedRunningTime="2026-04-16 04:28:00.494083232 +0000 UTC m=+228.396859231" Apr 16 04:28:09.343837 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:09.343798 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:28:09.344231 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:09.343852 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:28:09.349095 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:09.349070 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:28:09.506889 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:09.506863 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-88b6ddcd5-x7jmn" Apr 16 04:28:09.549816 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:09.549781 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d95bc46c9-rpks7"] Apr 16 04:28:12.664393 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.664360 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fm95k"] Apr 16 04:28:12.668707 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.668686 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fm95k" Apr 16 04:28:12.672025 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.672002 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 04:28:12.675114 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.675088 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fm95k"] Apr 16 04:28:12.770854 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.770802 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5b75a953-7e5c-49dd-823f-92d909687592-dbus\") pod \"global-pull-secret-syncer-fm95k\" (UID: \"5b75a953-7e5c-49dd-823f-92d909687592\") " pod="kube-system/global-pull-secret-syncer-fm95k" Apr 16 04:28:12.771034 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.770873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b75a953-7e5c-49dd-823f-92d909687592-original-pull-secret\") pod \"global-pull-secret-syncer-fm95k\" (UID: \"5b75a953-7e5c-49dd-823f-92d909687592\") " pod="kube-system/global-pull-secret-syncer-fm95k" Apr 16 04:28:12.771034 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.770945 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5b75a953-7e5c-49dd-823f-92d909687592-kubelet-config\") pod \"global-pull-secret-syncer-fm95k\" (UID: \"5b75a953-7e5c-49dd-823f-92d909687592\") " pod="kube-system/global-pull-secret-syncer-fm95k" Apr 16 04:28:12.872447 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.872310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/5b75a953-7e5c-49dd-823f-92d909687592-original-pull-secret\") pod \"global-pull-secret-syncer-fm95k\" (UID: \"5b75a953-7e5c-49dd-823f-92d909687592\") " pod="kube-system/global-pull-secret-syncer-fm95k" Apr 16 04:28:12.872447 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.872401 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5b75a953-7e5c-49dd-823f-92d909687592-kubelet-config\") pod \"global-pull-secret-syncer-fm95k\" (UID: \"5b75a953-7e5c-49dd-823f-92d909687592\") " pod="kube-system/global-pull-secret-syncer-fm95k" Apr 16 04:28:12.872447 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.872444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5b75a953-7e5c-49dd-823f-92d909687592-dbus\") pod \"global-pull-secret-syncer-fm95k\" (UID: \"5b75a953-7e5c-49dd-823f-92d909687592\") " pod="kube-system/global-pull-secret-syncer-fm95k" Apr 16 04:28:12.872744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.872528 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5b75a953-7e5c-49dd-823f-92d909687592-kubelet-config\") pod \"global-pull-secret-syncer-fm95k\" (UID: \"5b75a953-7e5c-49dd-823f-92d909687592\") " pod="kube-system/global-pull-secret-syncer-fm95k" Apr 16 04:28:12.872744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.872593 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5b75a953-7e5c-49dd-823f-92d909687592-dbus\") pod \"global-pull-secret-syncer-fm95k\" (UID: \"5b75a953-7e5c-49dd-823f-92d909687592\") " pod="kube-system/global-pull-secret-syncer-fm95k" Apr 16 04:28:12.874874 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.874844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b75a953-7e5c-49dd-823f-92d909687592-original-pull-secret\") pod \"global-pull-secret-syncer-fm95k\" (UID: \"5b75a953-7e5c-49dd-823f-92d909687592\") " pod="kube-system/global-pull-secret-syncer-fm95k" Apr 16 04:28:12.980213 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:12.980164 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fm95k" Apr 16 04:28:13.099206 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:13.099172 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fm95k"] Apr 16 04:28:13.102107 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:28:13.102078 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b75a953_7e5c_49dd_823f_92d909687592.slice/crio-99ddd00aca2ee1186b45f18866dfc2e34a84f5562a82775c0a49c19d9a596aea WatchSource:0}: Error finding container 99ddd00aca2ee1186b45f18866dfc2e34a84f5562a82775c0a49c19d9a596aea: Status 404 returned error can't find the container with id 99ddd00aca2ee1186b45f18866dfc2e34a84f5562a82775c0a49c19d9a596aea Apr 16 04:28:13.517453 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:13.517418 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fm95k" event={"ID":"5b75a953-7e5c-49dd-823f-92d909687592","Type":"ContainerStarted","Data":"99ddd00aca2ee1186b45f18866dfc2e34a84f5562a82775c0a49c19d9a596aea"} Apr 16 04:28:17.530037 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:17.530002 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fm95k" event={"ID":"5b75a953-7e5c-49dd-823f-92d909687592","Type":"ContainerStarted","Data":"d71717e9c3db271c25c438ca6a1b37a31255ba89a172bee8e02464a2fd533ec0"} Apr 16 04:28:17.546720 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:17.546675 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fm95k" podStartSLOduration=1.307439966 podStartE2EDuration="5.546658623s" podCreationTimestamp="2026-04-16 04:28:12 +0000 UTC" firstStartedPulling="2026-04-16 04:28:13.103642759 +0000 UTC m=+241.006418736" lastFinishedPulling="2026-04-16 04:28:17.342861415 +0000 UTC m=+245.245637393" observedRunningTime="2026-04-16 04:28:17.544750214 +0000 UTC m=+245.447526224" watchObservedRunningTime="2026-04-16 04:28:17.546658623 +0000 UTC m=+245.449434622" Apr 16 04:28:34.574638 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.574527 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-d95bc46c9-rpks7" podUID="b4643967-3602-4e57-bac6-5500591809f5" containerName="console" containerID="cri-o://b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e" gracePeriod=15 Apr 16 04:28:34.814632 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.814605 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d95bc46c9-rpks7_b4643967-3602-4e57-bac6-5500591809f5/console/0.log" Apr 16 04:28:34.814760 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.814675 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d95bc46c9-rpks7"
Apr 16 04:28:34.845574 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.845493 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-service-ca\") pod \"b4643967-3602-4e57-bac6-5500591809f5\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") "
Apr 16 04:28:34.845574 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.845532 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-oauth-serving-cert\") pod \"b4643967-3602-4e57-bac6-5500591809f5\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") "
Apr 16 04:28:34.845574 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.845555 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dxrf\" (UniqueName: \"kubernetes.io/projected/b4643967-3602-4e57-bac6-5500591809f5-kube-api-access-4dxrf\") pod \"b4643967-3602-4e57-bac6-5500591809f5\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") "
Apr 16 04:28:34.845793 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.845742 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4643967-3602-4e57-bac6-5500591809f5-console-serving-cert\") pod \"b4643967-3602-4e57-bac6-5500591809f5\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") "
Apr 16 04:28:34.845837 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.845810 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-console-config\") pod \"b4643967-3602-4e57-bac6-5500591809f5\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") "
Apr 16 04:28:34.845880 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.845846 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-trusted-ca-bundle\") pod \"b4643967-3602-4e57-bac6-5500591809f5\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") "
Apr 16 04:28:34.845880 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.845868 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4643967-3602-4e57-bac6-5500591809f5-console-oauth-config\") pod \"b4643967-3602-4e57-bac6-5500591809f5\" (UID: \"b4643967-3602-4e57-bac6-5500591809f5\") "
Apr 16 04:28:34.845988 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.845925 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "b4643967-3602-4e57-bac6-5500591809f5" (UID: "b4643967-3602-4e57-bac6-5500591809f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:28:34.845988 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.845973 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b4643967-3602-4e57-bac6-5500591809f5" (UID: "b4643967-3602-4e57-bac6-5500591809f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:28:34.846245 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.846222 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-service-ca\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:28:34.846363 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.846251 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-oauth-serving-cert\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:28:34.846363 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.846216 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-console-config" (OuterVolumeSpecName: "console-config") pod "b4643967-3602-4e57-bac6-5500591809f5" (UID: "b4643967-3602-4e57-bac6-5500591809f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:28:34.846363 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.846297 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b4643967-3602-4e57-bac6-5500591809f5" (UID: "b4643967-3602-4e57-bac6-5500591809f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:28:34.848102 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.848079 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4643967-3602-4e57-bac6-5500591809f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b4643967-3602-4e57-bac6-5500591809f5" (UID: "b4643967-3602-4e57-bac6-5500591809f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:28:34.848364 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.848303 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4643967-3602-4e57-bac6-5500591809f5-kube-api-access-4dxrf" (OuterVolumeSpecName: "kube-api-access-4dxrf") pod "b4643967-3602-4e57-bac6-5500591809f5" (UID: "b4643967-3602-4e57-bac6-5500591809f5"). InnerVolumeSpecName "kube-api-access-4dxrf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:28:34.848364 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.848311 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4643967-3602-4e57-bac6-5500591809f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b4643967-3602-4e57-bac6-5500591809f5" (UID: "b4643967-3602-4e57-bac6-5500591809f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:28:34.947537 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.947498 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4643967-3602-4e57-bac6-5500591809f5-console-serving-cert\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:28:34.947537 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.947531 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-console-config\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:28:34.947537 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.947541 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4643967-3602-4e57-bac6-5500591809f5-trusted-ca-bundle\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:28:34.947772 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.947550 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4643967-3602-4e57-bac6-5500591809f5-console-oauth-config\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:28:34.947772 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:34.947559 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4dxrf\" (UniqueName: \"kubernetes.io/projected/b4643967-3602-4e57-bac6-5500591809f5-kube-api-access-4dxrf\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:28:35.585631 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:35.585603 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d95bc46c9-rpks7_b4643967-3602-4e57-bac6-5500591809f5/console/0.log"
Apr 16 04:28:35.586020 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:35.585641 2575 generic.go:358] "Generic (PLEG): container finished" podID="b4643967-3602-4e57-bac6-5500591809f5" containerID="b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e" exitCode=2
Apr 16 04:28:35.586020 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:35.585712 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d95bc46c9-rpks7"
Apr 16 04:28:35.586020 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:35.585718 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d95bc46c9-rpks7" event={"ID":"b4643967-3602-4e57-bac6-5500591809f5","Type":"ContainerDied","Data":"b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e"}
Apr 16 04:28:35.586020 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:35.585810 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d95bc46c9-rpks7" event={"ID":"b4643967-3602-4e57-bac6-5500591809f5","Type":"ContainerDied","Data":"c0a9d33a4b98b019944ac1144288ea9cfdd278c00d0f17fbc29edab3d1151a3b"}
Apr 16 04:28:35.586020 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:35.585824 2575 scope.go:117] "RemoveContainer" containerID="b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e"
Apr 16 04:28:35.593807 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:35.593790 2575 scope.go:117] "RemoveContainer" containerID="b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e"
Apr 16 04:28:35.594073 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:28:35.594056 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e\": container with ID starting with b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e not found: ID does not exist" containerID="b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e"
Apr 16 04:28:35.594115 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:35.594083 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e"} err="failed to get container status \"b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e\": rpc error: code = NotFound desc = could not find container \"b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e\": container with ID starting with b6830a33262aaaa291f956ddb77fc261e7ebec4a00d9655dff6d75c1b5d5142e not found: ID does not exist"
Apr 16 04:28:35.605182 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:35.605154 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d95bc46c9-rpks7"]
Apr 16 04:28:35.610603 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:35.610580 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d95bc46c9-rpks7"]
Apr 16 04:28:36.706831 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:36.706801 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4643967-3602-4e57-bac6-5500591809f5" path="/var/lib/kubelet/pods/b4643967-3602-4e57-bac6-5500591809f5/volumes"
Apr 16 04:28:53.848558 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.848528 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm"]
Apr 16 04:28:53.848919 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.848820 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4643967-3602-4e57-bac6-5500591809f5" containerName="console"
Apr 16 04:28:53.848919 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.848832 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4643967-3602-4e57-bac6-5500591809f5" containerName="console"
Apr 16 04:28:53.848919 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.848892 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4643967-3602-4e57-bac6-5500591809f5" containerName="console"
Apr 16 04:28:53.853774 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.853756 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm"
Apr 16 04:28:53.856490 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.856467 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 16 04:28:53.856626 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.856508 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-lbzdm\""
Apr 16 04:28:53.856626 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.856533 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 16 04:28:53.860681 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.860660 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm"]
Apr 16 04:28:53.896219 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.896178 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b89708df-116c-4a40-b01b-c31cc4839cf5-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tkfpm\" (UID: \"b89708df-116c-4a40-b01b-c31cc4839cf5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm"
Apr 16 04:28:53.896430 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.896398 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46mrp\" (UniqueName: \"kubernetes.io/projected/b89708df-116c-4a40-b01b-c31cc4839cf5-kube-api-access-46mrp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tkfpm\" (UID: \"b89708df-116c-4a40-b01b-c31cc4839cf5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm"
Apr 16 04:28:53.997299 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.997260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b89708df-116c-4a40-b01b-c31cc4839cf5-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tkfpm\" (UID: \"b89708df-116c-4a40-b01b-c31cc4839cf5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm"
Apr 16 04:28:53.997503 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.997376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46mrp\" (UniqueName: \"kubernetes.io/projected/b89708df-116c-4a40-b01b-c31cc4839cf5-kube-api-access-46mrp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tkfpm\" (UID: \"b89708df-116c-4a40-b01b-c31cc4839cf5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm"
Apr 16 04:28:53.997678 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:53.997656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b89708df-116c-4a40-b01b-c31cc4839cf5-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tkfpm\" (UID: \"b89708df-116c-4a40-b01b-c31cc4839cf5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm"
Apr 16 04:28:54.005290 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:54.005258 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46mrp\" (UniqueName: \"kubernetes.io/projected/b89708df-116c-4a40-b01b-c31cc4839cf5-kube-api-access-46mrp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tkfpm\" (UID: \"b89708df-116c-4a40-b01b-c31cc4839cf5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm"
Apr 16 04:28:54.163897 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:54.163815 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm"
Apr 16 04:28:54.286371 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:54.286205 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm"]
Apr 16 04:28:54.289369 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:28:54.289343 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb89708df_116c_4a40_b01b_c31cc4839cf5.slice/crio-d637094d94eac1743e9330effe59d196fd2655ed3da18e4721279076b033e437 WatchSource:0}: Error finding container d637094d94eac1743e9330effe59d196fd2655ed3da18e4721279076b033e437: Status 404 returned error can't find the container with id d637094d94eac1743e9330effe59d196fd2655ed3da18e4721279076b033e437
Apr 16 04:28:54.641248 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:54.641212 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm" event={"ID":"b89708df-116c-4a40-b01b-c31cc4839cf5","Type":"ContainerStarted","Data":"d637094d94eac1743e9330effe59d196fd2655ed3da18e4721279076b033e437"}
Apr 16 04:28:57.653360 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:57.653312 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm" event={"ID":"b89708df-116c-4a40-b01b-c31cc4839cf5","Type":"ContainerStarted","Data":"8d94c814c47ea96d144ca25e2cab7cb787ebdb5c66971d10e4c47b69db474104"}
Apr 16 04:28:57.674846 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:57.674793 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tkfpm" podStartSLOduration=2.303876861 podStartE2EDuration="4.674777912s" podCreationTimestamp="2026-04-16 04:28:53 +0000 UTC" firstStartedPulling="2026-04-16 04:28:54.291900601 +0000 UTC m=+282.194676579" lastFinishedPulling="2026-04-16 04:28:56.662801638 +0000 UTC m=+284.565577630" observedRunningTime="2026-04-16 04:28:57.673250148 +0000 UTC m=+285.576026150" watchObservedRunningTime="2026-04-16 04:28:57.674777912 +0000 UTC m=+285.577553911"
Apr 16 04:28:58.771616 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.771580 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"]
Apr 16 04:28:58.774602 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.774586 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"
Apr 16 04:28:58.777200 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.777179 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 04:28:58.778357 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.778341 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 04:28:58.778455 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.778397 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v68fx\""
Apr 16 04:28:58.782480 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.782457 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"]
Apr 16 04:28:58.841560 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.841521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n\" (UID: \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"
Apr 16 04:28:58.841761 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.841571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b72nr\" (UniqueName: \"kubernetes.io/projected/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-kube-api-access-b72nr\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n\" (UID: \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"
Apr 16 04:28:58.841761 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.841684 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n\" (UID: \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"
Apr 16 04:28:58.942580 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.942528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n\" (UID: \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"
Apr 16 04:28:58.942580 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.942591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b72nr\" (UniqueName: \"kubernetes.io/projected/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-kube-api-access-b72nr\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n\" (UID: \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"
Apr 16 04:28:58.942839 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.942635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n\" (UID: \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"
Apr 16 04:28:58.942925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.942903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n\" (UID: \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"
Apr 16 04:28:58.943009 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.942990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n\" (UID: \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"
Apr 16 04:28:58.950134 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:58.950109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b72nr\" (UniqueName: \"kubernetes.io/projected/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-kube-api-access-b72nr\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n\" (UID: \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"
Apr 16 04:28:59.084771 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:59.084678 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"
Apr 16 04:28:59.208321 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:59.208289 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n"]
Apr 16 04:28:59.212053 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:28:59.212023 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod354e9f15_dba3_4a6d_94c5_fefd47ef6f37.slice/crio-df5c14da018020613e9587e53961353dcd8d50e30989c028d484efa326fdc796 WatchSource:0}: Error finding container df5c14da018020613e9587e53961353dcd8d50e30989c028d484efa326fdc796: Status 404 returned error can't find the container with id df5c14da018020613e9587e53961353dcd8d50e30989c028d484efa326fdc796
Apr 16 04:28:59.661408 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:59.661368 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n" event={"ID":"354e9f15-dba3-4a6d-94c5-fefd47ef6f37","Type":"ContainerStarted","Data":"df5c14da018020613e9587e53961353dcd8d50e30989c028d484efa326fdc796"}
Apr 16 04:28:59.889414 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:59.889378 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2kzxz"]
Apr 16 04:28:59.892637 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:59.892616 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz"
Apr 16 04:28:59.895103 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:59.895076 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 04:28:59.895288 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:59.895269 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 04:28:59.896488 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:59.896470 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-vmfjt\""
Apr 16 04:28:59.900907 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:59.900879 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2kzxz"]
Apr 16 04:28:59.952006 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:59.951866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88bef77f-a67f-4920-93f6-c7785e20a4d9-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2kzxz\" (UID: \"88bef77f-a67f-4920-93f6-c7785e20a4d9\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz"
Apr 16 04:28:59.952006 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:28:59.951926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47hss\" (UniqueName: \"kubernetes.io/projected/88bef77f-a67f-4920-93f6-c7785e20a4d9-kube-api-access-47hss\") pod \"cert-manager-webhook-597b96b99b-2kzxz\" (UID: \"88bef77f-a67f-4920-93f6-c7785e20a4d9\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz"
Apr 16 04:29:00.053165 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:00.053125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88bef77f-a67f-4920-93f6-c7785e20a4d9-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2kzxz\" (UID: \"88bef77f-a67f-4920-93f6-c7785e20a4d9\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz"
Apr 16 04:29:00.053165 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:00.053169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47hss\" (UniqueName: \"kubernetes.io/projected/88bef77f-a67f-4920-93f6-c7785e20a4d9-kube-api-access-47hss\") pod \"cert-manager-webhook-597b96b99b-2kzxz\" (UID: \"88bef77f-a67f-4920-93f6-c7785e20a4d9\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz"
Apr 16 04:29:00.061655 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:00.061603 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88bef77f-a67f-4920-93f6-c7785e20a4d9-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2kzxz\" (UID: \"88bef77f-a67f-4920-93f6-c7785e20a4d9\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz"
Apr 16 04:29:00.061805 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:00.061751 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47hss\" (UniqueName: \"kubernetes.io/projected/88bef77f-a67f-4920-93f6-c7785e20a4d9-kube-api-access-47hss\") pod \"cert-manager-webhook-597b96b99b-2kzxz\" (UID: \"88bef77f-a67f-4920-93f6-c7785e20a4d9\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz"
Apr 16 04:29:00.210691 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:00.210599 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz"
Apr 16 04:29:00.353670 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:00.353642 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2kzxz"]
Apr 16 04:29:00.356452 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:29:00.356415 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88bef77f_a67f_4920_93f6_c7785e20a4d9.slice/crio-b84cc61d84e4f3194dfa271885609b5256ebf37e6ea8c3bb8458ac330dafa571 WatchSource:0}: Error finding container b84cc61d84e4f3194dfa271885609b5256ebf37e6ea8c3bb8458ac330dafa571: Status 404 returned error can't find the container with id b84cc61d84e4f3194dfa271885609b5256ebf37e6ea8c3bb8458ac330dafa571
Apr 16 04:29:00.665366 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:00.665305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz" event={"ID":"88bef77f-a67f-4920-93f6-c7785e20a4d9","Type":"ContainerStarted","Data":"b84cc61d84e4f3194dfa271885609b5256ebf37e6ea8c3bb8458ac330dafa571"}
Apr 16 04:29:05.684631 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:05.684597 2575 generic.go:358] "Generic (PLEG): container finished" podID="354e9f15-dba3-4a6d-94c5-fefd47ef6f37" containerID="d4ae9274ed52459459908bce51d5ef3f75ee84e8ae5924f7e46a24907b31a0a6" exitCode=0
Apr 16 04:29:05.685195 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:05.684692 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n" event={"ID":"354e9f15-dba3-4a6d-94c5-fefd47ef6f37","Type":"ContainerDied","Data":"d4ae9274ed52459459908bce51d5ef3f75ee84e8ae5924f7e46a24907b31a0a6"}
Apr 16 04:29:05.686150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:05.686131 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz" event={"ID":"88bef77f-a67f-4920-93f6-c7785e20a4d9","Type":"ContainerStarted","Data":"2b7b91a2e7bc38337f5869bb2dd61e9473847860d5e5aaa6c0c1a7a320c6012e"}
Apr 16 04:29:05.686272 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:05.686256 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz"
Apr 16 04:29:05.713023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:05.712969 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz" podStartSLOduration=2.007891173 podStartE2EDuration="6.712951087s" podCreationTimestamp="2026-04-16 04:28:59 +0000 UTC" firstStartedPulling="2026-04-16 04:29:00.358710737 +0000 UTC m=+288.261486716" lastFinishedPulling="2026-04-16 04:29:05.06377065 +0000 UTC m=+292.966546630" observedRunningTime="2026-04-16 04:29:05.712529098 +0000 UTC m=+293.615305108" watchObservedRunningTime="2026-04-16 04:29:05.712951087 +0000 UTC m=+293.615727088"
Apr 16 04:29:07.695427 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:07.695394 2575 generic.go:358] "Generic (PLEG): container finished" podID="354e9f15-dba3-4a6d-94c5-fefd47ef6f37" containerID="14dc2f51ebb70b7135a0773bb9b5add8e56b0a2e5e408934961534c8c91c0774" exitCode=0
Apr 16 04:29:07.695807 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:07.695477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n" event={"ID":"354e9f15-dba3-4a6d-94c5-fefd47ef6f37","Type":"ContainerDied","Data":"14dc2f51ebb70b7135a0773bb9b5add8e56b0a2e5e408934961534c8c91c0774"}
Apr 16 04:29:10.915803 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:10.915761 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-rwzmr"]
Apr 16 04:29:10.918210 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:10.918185 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-rwzmr"
Apr 16 04:29:10.920622 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:10.920599 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-dxk6z\""
Apr 16 04:29:10.928808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:10.928781 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-rwzmr"]
Apr 16 04:29:11.046937 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:11.046901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67ec97f0-9a7b-4f84-a35a-9d84ae496f59-bound-sa-token\") pod \"cert-manager-759f64656b-rwzmr\" (UID: \"67ec97f0-9a7b-4f84-a35a-9d84ae496f59\") " pod="cert-manager/cert-manager-759f64656b-rwzmr"
Apr 16 04:29:11.047142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:11.046966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mhkl\" (UniqueName: \"kubernetes.io/projected/67ec97f0-9a7b-4f84-a35a-9d84ae496f59-kube-api-access-8mhkl\") pod \"cert-manager-759f64656b-rwzmr\" (UID: \"67ec97f0-9a7b-4f84-a35a-9d84ae496f59\") " pod="cert-manager/cert-manager-759f64656b-rwzmr"
Apr 16 04:29:11.148046 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:11.147996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67ec97f0-9a7b-4f84-a35a-9d84ae496f59-bound-sa-token\") pod \"cert-manager-759f64656b-rwzmr\" (UID: \"67ec97f0-9a7b-4f84-a35a-9d84ae496f59\") " pod="cert-manager/cert-manager-759f64656b-rwzmr"
Apr 16 04:29:11.148216 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:11.148078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mhkl\" (UniqueName: \"kubernetes.io/projected/67ec97f0-9a7b-4f84-a35a-9d84ae496f59-kube-api-access-8mhkl\") pod \"cert-manager-759f64656b-rwzmr\" (UID: \"67ec97f0-9a7b-4f84-a35a-9d84ae496f59\") " pod="cert-manager/cert-manager-759f64656b-rwzmr"
Apr 16 04:29:11.156163 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:11.156121 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67ec97f0-9a7b-4f84-a35a-9d84ae496f59-bound-sa-token\") pod \"cert-manager-759f64656b-rwzmr\" (UID: \"67ec97f0-9a7b-4f84-a35a-9d84ae496f59\") " pod="cert-manager/cert-manager-759f64656b-rwzmr"
Apr 16 04:29:11.156311 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:11.156210 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mhkl\" (UniqueName: \"kubernetes.io/projected/67ec97f0-9a7b-4f84-a35a-9d84ae496f59-kube-api-access-8mhkl\") pod \"cert-manager-759f64656b-rwzmr\" (UID: \"67ec97f0-9a7b-4f84-a35a-9d84ae496f59\") " pod="cert-manager/cert-manager-759f64656b-rwzmr"
Apr 16 04:29:11.228479 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:11.228384 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-rwzmr" Apr 16 04:29:11.692513 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:11.692485 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-2kzxz" Apr 16 04:29:13.289898 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:13.289869 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:29:13.290962 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:13.290935 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:29:13.302141 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:13.302072 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 04:29:13.381655 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:13.381635 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-rwzmr"] Apr 16 04:29:13.384902 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:29:13.384869 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67ec97f0_9a7b_4f84_a35a_9d84ae496f59.slice/crio-655b7ed80313bc724787ece346575a7637c0ebc5d679dd0a44f6540bb4fc7328 WatchSource:0}: Error finding container 655b7ed80313bc724787ece346575a7637c0ebc5d679dd0a44f6540bb4fc7328: Status 404 returned error can't find the container with id 655b7ed80313bc724787ece346575a7637c0ebc5d679dd0a44f6540bb4fc7328 Apr 16 04:29:13.720099 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:13.720061 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-rwzmr" 
event={"ID":"67ec97f0-9a7b-4f84-a35a-9d84ae496f59","Type":"ContainerStarted","Data":"d41a5bb228d70561a6b9a15a6de64d812675803a3f2b6007f53db410d4e9581d"} Apr 16 04:29:13.720099 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:13.720102 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-rwzmr" event={"ID":"67ec97f0-9a7b-4f84-a35a-9d84ae496f59","Type":"ContainerStarted","Data":"655b7ed80313bc724787ece346575a7637c0ebc5d679dd0a44f6540bb4fc7328"} Apr 16 04:29:13.724654 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:13.724616 2575 generic.go:358] "Generic (PLEG): container finished" podID="354e9f15-dba3-4a6d-94c5-fefd47ef6f37" containerID="6b5f6f59f7f188efc63cb6bd7bc8583cf3ba3b98651cd6b499c0b445be0021a8" exitCode=0 Apr 16 04:29:13.724785 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:13.724676 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n" event={"ID":"354e9f15-dba3-4a6d-94c5-fefd47ef6f37","Type":"ContainerDied","Data":"6b5f6f59f7f188efc63cb6bd7bc8583cf3ba3b98651cd6b499c0b445be0021a8"} Apr 16 04:29:13.736516 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:13.736471 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-rwzmr" podStartSLOduration=3.736456358 podStartE2EDuration="3.736456358s" podCreationTimestamp="2026-04-16 04:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:29:13.734727964 +0000 UTC m=+301.637503962" watchObservedRunningTime="2026-04-16 04:29:13.736456358 +0000 UTC m=+301.639232360" Apr 16 04:29:14.850443 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:14.850420 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n" Apr 16 04:29:14.985413 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:14.985296 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-bundle\") pod \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\" (UID: \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\") " Apr 16 04:29:14.985413 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:14.985376 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b72nr\" (UniqueName: \"kubernetes.io/projected/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-kube-api-access-b72nr\") pod \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\" (UID: \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\") " Apr 16 04:29:14.985645 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:14.985478 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-util\") pod \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\" (UID: \"354e9f15-dba3-4a6d-94c5-fefd47ef6f37\") " Apr 16 04:29:14.985704 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:14.985649 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-bundle" (OuterVolumeSpecName: "bundle") pod "354e9f15-dba3-4a6d-94c5-fefd47ef6f37" (UID: "354e9f15-dba3-4a6d-94c5-fefd47ef6f37"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:29:14.987655 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:14.987632 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-kube-api-access-b72nr" (OuterVolumeSpecName: "kube-api-access-b72nr") pod "354e9f15-dba3-4a6d-94c5-fefd47ef6f37" (UID: "354e9f15-dba3-4a6d-94c5-fefd47ef6f37"). InnerVolumeSpecName "kube-api-access-b72nr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:29:14.989585 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:14.989560 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-util" (OuterVolumeSpecName: "util") pod "354e9f15-dba3-4a6d-94c5-fefd47ef6f37" (UID: "354e9f15-dba3-4a6d-94c5-fefd47ef6f37"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:29:15.086238 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:15.086201 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-util\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:29:15.086238 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:15.086229 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-bundle\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:29:15.086238 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:15.086238 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b72nr\" (UniqueName: \"kubernetes.io/projected/354e9f15-dba3-4a6d-94c5-fefd47ef6f37-kube-api-access-b72nr\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:29:15.733411 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:15.733372 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n" event={"ID":"354e9f15-dba3-4a6d-94c5-fefd47ef6f37","Type":"ContainerDied","Data":"df5c14da018020613e9587e53961353dcd8d50e30989c028d484efa326fdc796"} Apr 16 04:29:15.733411 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:15.733402 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqx87n" Apr 16 04:29:15.733411 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:15.733412 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df5c14da018020613e9587e53961353dcd8d50e30989c028d484efa326fdc796" Apr 16 04:29:39.550522 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.550484 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q"] Apr 16 04:29:39.551012 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.550869 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="354e9f15-dba3-4a6d-94c5-fefd47ef6f37" containerName="util" Apr 16 04:29:39.551012 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.550883 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="354e9f15-dba3-4a6d-94c5-fefd47ef6f37" containerName="util" Apr 16 04:29:39.551012 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.550900 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="354e9f15-dba3-4a6d-94c5-fefd47ef6f37" containerName="pull" Apr 16 04:29:39.551012 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.550909 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="354e9f15-dba3-4a6d-94c5-fefd47ef6f37" containerName="pull" Apr 16 04:29:39.551012 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.550924 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="354e9f15-dba3-4a6d-94c5-fefd47ef6f37" containerName="extract" Apr 16 04:29:39.551012 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.550929 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="354e9f15-dba3-4a6d-94c5-fefd47ef6f37" containerName="extract" Apr 16 04:29:39.551012 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.550995 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="354e9f15-dba3-4a6d-94c5-fefd47ef6f37" containerName="extract" Apr 16 04:29:39.553971 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.553954 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" Apr 16 04:29:39.557313 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.557273 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 04:29:39.557830 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.557806 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-zzbzl\"" Apr 16 04:29:39.557955 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.557845 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 04:29:39.558145 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.558129 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 04:29:39.558228 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.558216 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 04:29:39.568385 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.568360 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q"] Apr 16 04:29:39.598831 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.598800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hqzq\" (UniqueName: \"kubernetes.io/projected/41916b10-588d-4396-b383-4bd43536f029-kube-api-access-7hqzq\") pod \"opendatahub-operator-controller-manager-c7946b447-rkb2q\" (UID: \"41916b10-588d-4396-b383-4bd43536f029\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" Apr 16 04:29:39.599014 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.598884 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41916b10-588d-4396-b383-4bd43536f029-apiservice-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-rkb2q\" (UID: \"41916b10-588d-4396-b383-4bd43536f029\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" Apr 16 04:29:39.599014 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.598967 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41916b10-588d-4396-b383-4bd43536f029-webhook-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-rkb2q\" (UID: \"41916b10-588d-4396-b383-4bd43536f029\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" Apr 16 04:29:39.700013 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.699977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hqzq\" (UniqueName: \"kubernetes.io/projected/41916b10-588d-4396-b383-4bd43536f029-kube-api-access-7hqzq\") pod \"opendatahub-operator-controller-manager-c7946b447-rkb2q\" (UID: \"41916b10-588d-4396-b383-4bd43536f029\") " 
pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" Apr 16 04:29:39.700190 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.700064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41916b10-588d-4396-b383-4bd43536f029-apiservice-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-rkb2q\" (UID: \"41916b10-588d-4396-b383-4bd43536f029\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" Apr 16 04:29:39.700190 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.700121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41916b10-588d-4396-b383-4bd43536f029-webhook-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-rkb2q\" (UID: \"41916b10-588d-4396-b383-4bd43536f029\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" Apr 16 04:29:39.702552 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.702526 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41916b10-588d-4396-b383-4bd43536f029-webhook-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-rkb2q\" (UID: \"41916b10-588d-4396-b383-4bd43536f029\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" Apr 16 04:29:39.702663 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.702575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41916b10-588d-4396-b383-4bd43536f029-apiservice-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-rkb2q\" (UID: \"41916b10-588d-4396-b383-4bd43536f029\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" Apr 16 04:29:39.707600 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.707577 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hqzq\" (UniqueName: \"kubernetes.io/projected/41916b10-588d-4396-b383-4bd43536f029-kube-api-access-7hqzq\") pod \"opendatahub-operator-controller-manager-c7946b447-rkb2q\" (UID: \"41916b10-588d-4396-b383-4bd43536f029\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" Apr 16 04:29:39.865581 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.865484 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" Apr 16 04:29:39.996437 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:39.996407 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q"] Apr 16 04:29:39.999946 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:29:39.999918 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41916b10_588d_4396_b383_4bd43536f029.slice/crio-93c96c6d97334a08d67a5b1279b5cb6cd75d6e3c8aab55f5632db7bd63d1f663 WatchSource:0}: Error finding container 93c96c6d97334a08d67a5b1279b5cb6cd75d6e3c8aab55f5632db7bd63d1f663: Status 404 returned error can't find the container with id 93c96c6d97334a08d67a5b1279b5cb6cd75d6e3c8aab55f5632db7bd63d1f663 Apr 16 04:29:40.001764 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:40.001743 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 04:29:40.816925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:40.816871 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" event={"ID":"41916b10-588d-4396-b383-4bd43536f029","Type":"ContainerStarted","Data":"93c96c6d97334a08d67a5b1279b5cb6cd75d6e3c8aab55f5632db7bd63d1f663"} Apr 16 04:29:41.833888 ip-10-0-133-103 kubenswrapper[2575]: I0416 
04:29:41.833827 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6"] Apr 16 04:29:41.840919 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:41.840890 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:41.846059 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:41.846029 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6"] Apr 16 04:29:41.846419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:41.846392 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 04:29:41.846419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:41.846395 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 04:29:41.846609 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:41.846515 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-2xlxd\"" Apr 16 04:29:41.846609 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:41.846551 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 04:29:41.847634 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:41.847612 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 04:29:41.847784 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:41.847612 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 04:29:41.919300 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:41.919262 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5b421dd7-c112-44f4-bc6d-a4e455703e31-manager-config\") pod \"lws-controller-manager-5988777b7d-tsvd6\" (UID: \"5b421dd7-c112-44f4-bc6d-a4e455703e31\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:41.919513 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:41.919448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5b421dd7-c112-44f4-bc6d-a4e455703e31-metrics-cert\") pod \"lws-controller-manager-5988777b7d-tsvd6\" (UID: \"5b421dd7-c112-44f4-bc6d-a4e455703e31\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:41.919513 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:41.919485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b421dd7-c112-44f4-bc6d-a4e455703e31-cert\") pod \"lws-controller-manager-5988777b7d-tsvd6\" (UID: \"5b421dd7-c112-44f4-bc6d-a4e455703e31\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:41.919628 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:41.919552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8grv\" (UniqueName: \"kubernetes.io/projected/5b421dd7-c112-44f4-bc6d-a4e455703e31-kube-api-access-f8grv\") pod \"lws-controller-manager-5988777b7d-tsvd6\" (UID: \"5b421dd7-c112-44f4-bc6d-a4e455703e31\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:42.020722 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.020686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5b421dd7-c112-44f4-bc6d-a4e455703e31-manager-config\") pod 
\"lws-controller-manager-5988777b7d-tsvd6\" (UID: \"5b421dd7-c112-44f4-bc6d-a4e455703e31\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:42.020917 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.020763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5b421dd7-c112-44f4-bc6d-a4e455703e31-metrics-cert\") pod \"lws-controller-manager-5988777b7d-tsvd6\" (UID: \"5b421dd7-c112-44f4-bc6d-a4e455703e31\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:42.020917 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.020802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b421dd7-c112-44f4-bc6d-a4e455703e31-cert\") pod \"lws-controller-manager-5988777b7d-tsvd6\" (UID: \"5b421dd7-c112-44f4-bc6d-a4e455703e31\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:42.020917 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.020827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8grv\" (UniqueName: \"kubernetes.io/projected/5b421dd7-c112-44f4-bc6d-a4e455703e31-kube-api-access-f8grv\") pod \"lws-controller-manager-5988777b7d-tsvd6\" (UID: \"5b421dd7-c112-44f4-bc6d-a4e455703e31\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:42.021559 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.021525 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5b421dd7-c112-44f4-bc6d-a4e455703e31-manager-config\") pod \"lws-controller-manager-5988777b7d-tsvd6\" (UID: \"5b421dd7-c112-44f4-bc6d-a4e455703e31\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:42.023968 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.023944 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b421dd7-c112-44f4-bc6d-a4e455703e31-cert\") pod \"lws-controller-manager-5988777b7d-tsvd6\" (UID: \"5b421dd7-c112-44f4-bc6d-a4e455703e31\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:42.024078 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.023978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5b421dd7-c112-44f4-bc6d-a4e455703e31-metrics-cert\") pod \"lws-controller-manager-5988777b7d-tsvd6\" (UID: \"5b421dd7-c112-44f4-bc6d-a4e455703e31\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:42.029760 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.029733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8grv\" (UniqueName: \"kubernetes.io/projected/5b421dd7-c112-44f4-bc6d-a4e455703e31-kube-api-access-f8grv\") pod \"lws-controller-manager-5988777b7d-tsvd6\" (UID: \"5b421dd7-c112-44f4-bc6d-a4e455703e31\") " pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:42.154047 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.153958 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" Apr 16 04:29:42.716127 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.716106 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6"] Apr 16 04:29:42.717820 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:29:42.717783 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b421dd7_c112_44f4_bc6d_a4e455703e31.slice/crio-baa28559b2e989e61f1b50e068ce022f972b94526fbf869bf776cc0ffaa311d0 WatchSource:0}: Error finding container baa28559b2e989e61f1b50e068ce022f972b94526fbf869bf776cc0ffaa311d0: Status 404 returned error can't find the container with id baa28559b2e989e61f1b50e068ce022f972b94526fbf869bf776cc0ffaa311d0 Apr 16 04:29:42.825573 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.825473 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" event={"ID":"41916b10-588d-4396-b383-4bd43536f029","Type":"ContainerStarted","Data":"9b23fedfa612087fed8760a67b40002cd621fe3d062a16af8ab2cc5c28cd9f50"} Apr 16 04:29:42.825743 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.825600 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" Apr 16 04:29:42.826617 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.826594 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" event={"ID":"5b421dd7-c112-44f4-bc6d-a4e455703e31","Type":"ContainerStarted","Data":"baa28559b2e989e61f1b50e068ce022f972b94526fbf869bf776cc0ffaa311d0"} Apr 16 04:29:42.850874 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:42.850812 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q" podStartSLOduration=1.232746794 podStartE2EDuration="3.850793539s" podCreationTimestamp="2026-04-16 04:29:39 +0000 UTC" firstStartedPulling="2026-04-16 04:29:40.001921028 +0000 UTC m=+327.904697020" lastFinishedPulling="2026-04-16 04:29:42.619967774 +0000 UTC m=+330.522743765" observedRunningTime="2026-04-16 04:29:42.848824806 +0000 UTC m=+330.751600803" watchObservedRunningTime="2026-04-16 04:29:42.850793539 +0000 UTC m=+330.753569538"
Apr 16 04:29:44.834860 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:44.834827 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" event={"ID":"5b421dd7-c112-44f4-bc6d-a4e455703e31","Type":"ContainerStarted","Data":"b6c8852d26ea931743b8042bb77999cf75d73b95da90043e5dae6acba5367afe"}
Apr 16 04:29:44.834860 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:44.834876 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6"
Apr 16 04:29:44.851441 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:44.851390 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6" podStartSLOduration=2.338269537 podStartE2EDuration="3.851375134s" podCreationTimestamp="2026-04-16 04:29:41 +0000 UTC" firstStartedPulling="2026-04-16 04:29:42.719619521 +0000 UTC m=+330.622395502" lastFinishedPulling="2026-04-16 04:29:44.232725118 +0000 UTC m=+332.135501099" observedRunningTime="2026-04-16 04:29:44.849822208 +0000 UTC m=+332.752598208" watchObservedRunningTime="2026-04-16 04:29:44.851375134 +0000 UTC m=+332.754151132"
Apr 16 04:29:53.833014 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:53.832981 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-rkb2q"
Apr 16 04:29:55.840939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:29:55.840907 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5988777b7d-tsvd6"
Apr 16 04:30:10.182665 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.182586 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"]
Apr 16 04:30:10.185012 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.184994 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:10.189573 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.189514 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 04:30:10.190015 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.189995 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v68fx\""
Apr 16 04:30:10.195239 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.195217 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 04:30:10.205582 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.205556 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"]
Apr 16 04:30:10.363444 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.363402 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17b9981c-8588-4274-a9b5-f0d4131cdaa0-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67\" (UID: \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:10.363619 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.363532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17b9981c-8588-4274-a9b5-f0d4131cdaa0-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67\" (UID: \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:10.363619 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.363589 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65jf9\" (UniqueName: \"kubernetes.io/projected/17b9981c-8588-4274-a9b5-f0d4131cdaa0-kube-api-access-65jf9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67\" (UID: \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:10.465010 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.464908 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17b9981c-8588-4274-a9b5-f0d4131cdaa0-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67\" (UID: \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:10.465010 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.464965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65jf9\" (UniqueName: \"kubernetes.io/projected/17b9981c-8588-4274-a9b5-f0d4131cdaa0-kube-api-access-65jf9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67\" (UID: \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:10.465010 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.464995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17b9981c-8588-4274-a9b5-f0d4131cdaa0-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67\" (UID: \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:10.465416 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.465390 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17b9981c-8588-4274-a9b5-f0d4131cdaa0-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67\" (UID: \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:10.465416 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.465410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17b9981c-8588-4274-a9b5-f0d4131cdaa0-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67\" (UID: \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:10.474058 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.474027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65jf9\" (UniqueName: \"kubernetes.io/projected/17b9981c-8588-4274-a9b5-f0d4131cdaa0-kube-api-access-65jf9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67\" (UID: \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:10.496161 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.496131 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:10.618255 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.618228 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"]
Apr 16 04:30:10.620956 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:30:10.620911 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17b9981c_8588_4274_a9b5_f0d4131cdaa0.slice/crio-2bccec749968aeb2987b2b7471575b2159e52c6966965d139d4738d401276bb4 WatchSource:0}: Error finding container 2bccec749968aeb2987b2b7471575b2159e52c6966965d139d4738d401276bb4: Status 404 returned error can't find the container with id 2bccec749968aeb2987b2b7471575b2159e52c6966965d139d4738d401276bb4
Apr 16 04:30:10.919719 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.919686 2575 generic.go:358] "Generic (PLEG): container finished" podID="17b9981c-8588-4274-a9b5-f0d4131cdaa0" containerID="840cb595ff48dbcf5295a9906842bce85ecab90988de6d96bd64c555c18578a4" exitCode=0
Apr 16 04:30:10.919895 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.919772 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67" event={"ID":"17b9981c-8588-4274-a9b5-f0d4131cdaa0","Type":"ContainerDied","Data":"840cb595ff48dbcf5295a9906842bce85ecab90988de6d96bd64c555c18578a4"}
Apr 16 04:30:10.919895 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:10.919807 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67" event={"ID":"17b9981c-8588-4274-a9b5-f0d4131cdaa0","Type":"ContainerStarted","Data":"2bccec749968aeb2987b2b7471575b2159e52c6966965d139d4738d401276bb4"}
Apr 16 04:30:11.924646 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:11.924610 2575 generic.go:358] "Generic (PLEG): container finished" podID="17b9981c-8588-4274-a9b5-f0d4131cdaa0" containerID="33bd14a4004bc3f9f783b2f2ed49d7dd39157c1a835f415f7cad1e4c0641167a" exitCode=0
Apr 16 04:30:11.925031 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:11.924689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67" event={"ID":"17b9981c-8588-4274-a9b5-f0d4131cdaa0","Type":"ContainerDied","Data":"33bd14a4004bc3f9f783b2f2ed49d7dd39157c1a835f415f7cad1e4c0641167a"}
Apr 16 04:30:12.929882 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:12.929850 2575 generic.go:358] "Generic (PLEG): container finished" podID="17b9981c-8588-4274-a9b5-f0d4131cdaa0" containerID="1d5f50e5dfa2ff6cd3f99dd3cdf409ec799712328098f8adeb7ce5ee8bed1302" exitCode=0
Apr 16 04:30:12.930343 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:12.929888 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67" event={"ID":"17b9981c-8588-4274-a9b5-f0d4131cdaa0","Type":"ContainerDied","Data":"1d5f50e5dfa2ff6cd3f99dd3cdf409ec799712328098f8adeb7ce5ee8bed1302"}
Apr 16 04:30:14.058902 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.058876 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:14.199728 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.199645 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17b9981c-8588-4274-a9b5-f0d4131cdaa0-bundle\") pod \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\" (UID: \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\") "
Apr 16 04:30:14.199728 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.199686 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17b9981c-8588-4274-a9b5-f0d4131cdaa0-util\") pod \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\" (UID: \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\") "
Apr 16 04:30:14.199728 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.199721 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65jf9\" (UniqueName: \"kubernetes.io/projected/17b9981c-8588-4274-a9b5-f0d4131cdaa0-kube-api-access-65jf9\") pod \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\" (UID: \"17b9981c-8588-4274-a9b5-f0d4131cdaa0\") "
Apr 16 04:30:14.200929 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.200896 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17b9981c-8588-4274-a9b5-f0d4131cdaa0-bundle" (OuterVolumeSpecName: "bundle") pod "17b9981c-8588-4274-a9b5-f0d4131cdaa0" (UID: "17b9981c-8588-4274-a9b5-f0d4131cdaa0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:30:14.201988 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.201963 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b9981c-8588-4274-a9b5-f0d4131cdaa0-kube-api-access-65jf9" (OuterVolumeSpecName: "kube-api-access-65jf9") pod "17b9981c-8588-4274-a9b5-f0d4131cdaa0" (UID: "17b9981c-8588-4274-a9b5-f0d4131cdaa0"). InnerVolumeSpecName "kube-api-access-65jf9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:30:14.205527 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.205501 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17b9981c-8588-4274-a9b5-f0d4131cdaa0-util" (OuterVolumeSpecName: "util") pod "17b9981c-8588-4274-a9b5-f0d4131cdaa0" (UID: "17b9981c-8588-4274-a9b5-f0d4131cdaa0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:30:14.301246 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.301206 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17b9981c-8588-4274-a9b5-f0d4131cdaa0-bundle\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:30:14.301246 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.301253 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17b9981c-8588-4274-a9b5-f0d4131cdaa0-util\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:30:14.301471 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.301265 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65jf9\" (UniqueName: \"kubernetes.io/projected/17b9981c-8588-4274-a9b5-f0d4131cdaa0-kube-api-access-65jf9\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:30:14.938319 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.938294 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67"
Apr 16 04:30:14.938474 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.938291 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25fx67" event={"ID":"17b9981c-8588-4274-a9b5-f0d4131cdaa0","Type":"ContainerDied","Data":"2bccec749968aeb2987b2b7471575b2159e52c6966965d139d4738d401276bb4"}
Apr 16 04:30:14.938474 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:14.938373 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bccec749968aeb2987b2b7471575b2159e52c6966965d139d4738d401276bb4"
Apr 16 04:30:30.783727 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.783687 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"]
Apr 16 04:30:30.784305 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.784173 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17b9981c-8588-4274-a9b5-f0d4131cdaa0" containerName="util"
Apr 16 04:30:30.784305 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.784192 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b9981c-8588-4274-a9b5-f0d4131cdaa0" containerName="util"
Apr 16 04:30:30.784305 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.784236 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17b9981c-8588-4274-a9b5-f0d4131cdaa0" containerName="pull"
Apr 16 04:30:30.784305 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.784245 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b9981c-8588-4274-a9b5-f0d4131cdaa0" containerName="pull"
Apr 16 04:30:30.784305 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.784271 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17b9981c-8588-4274-a9b5-f0d4131cdaa0" containerName="extract"
Apr 16 04:30:30.784305 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.784281 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b9981c-8588-4274-a9b5-f0d4131cdaa0" containerName="extract"
Apr 16 04:30:30.784619 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.784363 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="17b9981c-8588-4274-a9b5-f0d4131cdaa0" containerName="extract"
Apr 16 04:30:30.787367 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.787347 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.790099 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.790066 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 04:30:30.790229 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.790108 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-5wwvh\""
Apr 16 04:30:30.798593 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.798550 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"]
Apr 16 04:30:30.841549 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.841504 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79mnl\" (UniqueName: \"kubernetes.io/projected/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-kube-api-access-79mnl\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.841754 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.841565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.841754 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.841583 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.841754 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.841613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.841754 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.841715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.841754 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.841753 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.842002 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.841773 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.842002 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.841806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.842002 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.841871 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.942542 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.942506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.942542 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.942549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.942806 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.942686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.942806 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.942774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.942914 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.942820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.942914 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.942852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.942914 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.942901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.943131 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.942935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.943131 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.942943 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.943131 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.942964 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79mnl\" (UniqueName: \"kubernetes.io/projected/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-kube-api-access-79mnl\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.943281 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.943207 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.943392 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.943371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.943456 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.943441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.943498 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.943472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.945413 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.945388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.945584 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.945566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.952144 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.952119 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79mnl\" (UniqueName: \"kubernetes.io/projected/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-kube-api-access-79mnl\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:30.952302 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:30.952272 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff\" (UID: \"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:31.100532 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:31.100438 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:31.235935 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:31.235825 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"]
Apr 16 04:30:31.238354 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:30:31.238304 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86bff4a_b5e8_4159_a34e_9c8bdf20b6b8.slice/crio-2e5040596d167d7e23b2d145c1cf315e85774f5725e700933c080b0711e8d9bb WatchSource:0}: Error finding container 2e5040596d167d7e23b2d145c1cf315e85774f5725e700933c080b0711e8d9bb: Status 404 returned error can't find the container with id 2e5040596d167d7e23b2d145c1cf315e85774f5725e700933c080b0711e8d9bb
Apr 16 04:30:31.998677 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:31.998633 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff" event={"ID":"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8","Type":"ContainerStarted","Data":"2e5040596d167d7e23b2d145c1cf315e85774f5725e700933c080b0711e8d9bb"}
Apr 16 04:30:33.632043 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:33.632005 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 04:30:33.632301 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:33.632095 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 04:30:33.632301 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:33.632128 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 04:30:34.007444 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:34.007402 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff" event={"ID":"c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8","Type":"ContainerStarted","Data":"8efecf0f3fda1f0e0163fed2ae0a4119a308214953066b9db8527cfd45ae5a70"}
Apr 16 04:30:34.027949 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:34.027889 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff" podStartSLOduration=1.636362501 podStartE2EDuration="4.027872476s" podCreationTimestamp="2026-04-16 04:30:30 +0000 UTC" firstStartedPulling="2026-04-16 04:30:31.24021211 +0000 UTC m=+379.142988087" lastFinishedPulling="2026-04-16 04:30:33.631722071 +0000 UTC m=+381.534498062" observedRunningTime="2026-04-16 04:30:34.025503026 +0000 UTC m=+381.928279065" watchObservedRunningTime="2026-04-16 04:30:34.027872476 +0000 UTC m=+381.930648475"
Apr 16 04:30:34.101071 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:34.101029 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:34.102240 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:34.102215 2575 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.30:15021/healthz/ready\": dial tcp 10.134.0.30:15021: connect: connection refused" start-of-body=
Apr 16 04:30:34.102307 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:34.102278 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff" podUID="c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.30:15021/healthz/ready\": dial tcp 10.134.0.30:15021: connect: connection refused"
Apr 16 04:30:35.105046 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:35.105008 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:36.014188 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:36.014160 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:36.015210 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:36.015193 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff"
Apr 16 04:30:57.601405 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:57.601361 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t42pr"]
Apr 16 04:30:57.606375 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:57.606353 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-t42pr"
Apr 16 04:30:57.609215 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:57.609183 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 04:30:57.609215 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:57.609198 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-nwcb6\""
Apr 16 04:30:57.610452 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:57.610429 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 04:30:57.614924 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:57.614621 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t42pr"]
Apr 16 04:30:57.781360 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:57.781296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlzj7\" (UniqueName: \"kubernetes.io/projected/5d49287c-93c6-4ba0-bc3e-d2c822a219d7-kube-api-access-xlzj7\") pod \"kuadrant-operator-catalog-t42pr\" (UID: \"5d49287c-93c6-4ba0-bc3e-d2c822a219d7\") " pod="kuadrant-system/kuadrant-operator-catalog-t42pr"
Apr 16 04:30:57.882416 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:57.882289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlzj7\" (UniqueName: \"kubernetes.io/projected/5d49287c-93c6-4ba0-bc3e-d2c822a219d7-kube-api-access-xlzj7\") pod \"kuadrant-operator-catalog-t42pr\" (UID: \"5d49287c-93c6-4ba0-bc3e-d2c822a219d7\") " pod="kuadrant-system/kuadrant-operator-catalog-t42pr"
Apr 16 04:30:57.891872 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:57.891847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlzj7\" (UniqueName: \"kubernetes.io/projected/5d49287c-93c6-4ba0-bc3e-d2c822a219d7-kube-api-access-xlzj7\") pod \"kuadrant-operator-catalog-t42pr\" (UID: \"5d49287c-93c6-4ba0-bc3e-d2c822a219d7\") " pod="kuadrant-system/kuadrant-operator-catalog-t42pr"
Apr 16 04:30:57.917893 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:57.917858 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-t42pr"
Apr 16 04:30:57.941962 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:57.941922 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t42pr"]
Apr 16 04:30:58.043458 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:58.043431 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t42pr"]
Apr 16 04:30:58.045744 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:30:58.045713 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d49287c_93c6_4ba0_bc3e_d2c822a219d7.slice/crio-6b1079ea86cc2d5772fae4474012d72fdd7284f48584578097f4f3a8727df17a WatchSource:0}: Error finding container 6b1079ea86cc2d5772fae4474012d72fdd7284f48584578097f4f3a8727df17a: Status 404 returned error can't find the container with id 6b1079ea86cc2d5772fae4474012d72fdd7284f48584578097f4f3a8727df17a
Apr 16 04:30:58.091097 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:58.091050 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-t42pr" event={"ID":"5d49287c-93c6-4ba0-bc3e-d2c822a219d7","Type":"ContainerStarted","Data":"6b1079ea86cc2d5772fae4474012d72fdd7284f48584578097f4f3a8727df17a"}
Apr 16 04:30:58.148670
ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:58.148590 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lllgk"] Apr 16 04:30:58.153435 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:58.153413 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-lllgk" Apr 16 04:30:58.159227 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:58.159198 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lllgk"] Apr 16 04:30:58.285575 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:58.285531 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwfb4\" (UniqueName: \"kubernetes.io/projected/6b976af6-87f2-4d13-850b-af9235ade66d-kube-api-access-gwfb4\") pod \"kuadrant-operator-catalog-lllgk\" (UID: \"6b976af6-87f2-4d13-850b-af9235ade66d\") " pod="kuadrant-system/kuadrant-operator-catalog-lllgk" Apr 16 04:30:58.386516 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:58.386476 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwfb4\" (UniqueName: \"kubernetes.io/projected/6b976af6-87f2-4d13-850b-af9235ade66d-kube-api-access-gwfb4\") pod \"kuadrant-operator-catalog-lllgk\" (UID: \"6b976af6-87f2-4d13-850b-af9235ade66d\") " pod="kuadrant-system/kuadrant-operator-catalog-lllgk" Apr 16 04:30:58.394120 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:58.394091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwfb4\" (UniqueName: \"kubernetes.io/projected/6b976af6-87f2-4d13-850b-af9235ade66d-kube-api-access-gwfb4\") pod \"kuadrant-operator-catalog-lllgk\" (UID: \"6b976af6-87f2-4d13-850b-af9235ade66d\") " pod="kuadrant-system/kuadrant-operator-catalog-lllgk" Apr 16 04:30:58.465254 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:58.465165 2575 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-lllgk" Apr 16 04:30:58.806999 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:58.806972 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lllgk"] Apr 16 04:30:58.853163 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:30:58.853126 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b976af6_87f2_4d13_850b_af9235ade66d.slice/crio-9d676b47594fe1211506eaa513ea6b439f4f4f891345eebf6e85af48107e599a WatchSource:0}: Error finding container 9d676b47594fe1211506eaa513ea6b439f4f4f891345eebf6e85af48107e599a: Status 404 returned error can't find the container with id 9d676b47594fe1211506eaa513ea6b439f4f4f891345eebf6e85af48107e599a Apr 16 04:30:59.102477 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:30:59.102388 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-lllgk" event={"ID":"6b976af6-87f2-4d13-850b-af9235ade66d","Type":"ContainerStarted","Data":"9d676b47594fe1211506eaa513ea6b439f4f4f891345eebf6e85af48107e599a"} Apr 16 04:31:01.112427 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:01.112391 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-t42pr" event={"ID":"5d49287c-93c6-4ba0-bc3e-d2c822a219d7","Type":"ContainerStarted","Data":"b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8"} Apr 16 04:31:01.112880 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:01.112497 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-t42pr" podUID="5d49287c-93c6-4ba0-bc3e-d2c822a219d7" containerName="registry-server" containerID="cri-o://b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8" gracePeriod=2 Apr 16 04:31:01.113877 ip-10-0-133-103 kubenswrapper[2575]: I0416 
04:31:01.113840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-lllgk" event={"ID":"6b976af6-87f2-4d13-850b-af9235ade66d","Type":"ContainerStarted","Data":"11442a198dad8104b084da3cce72bebb57641adce31029571040e81afcb51171"} Apr 16 04:31:01.127095 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:01.127034 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-t42pr" podStartSLOduration=1.9736039669999998 podStartE2EDuration="4.127015244s" podCreationTimestamp="2026-04-16 04:30:57 +0000 UTC" firstStartedPulling="2026-04-16 04:30:58.04710247 +0000 UTC m=+405.949878447" lastFinishedPulling="2026-04-16 04:31:00.200513733 +0000 UTC m=+408.103289724" observedRunningTime="2026-04-16 04:31:01.126288525 +0000 UTC m=+409.029064524" watchObservedRunningTime="2026-04-16 04:31:01.127015244 +0000 UTC m=+409.029791243" Apr 16 04:31:01.143685 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:01.143636 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-lllgk" podStartSLOduration=1.7948807119999999 podStartE2EDuration="3.14361947s" podCreationTimestamp="2026-04-16 04:30:58 +0000 UTC" firstStartedPulling="2026-04-16 04:30:58.854639352 +0000 UTC m=+406.757415333" lastFinishedPulling="2026-04-16 04:31:00.203378113 +0000 UTC m=+408.106154091" observedRunningTime="2026-04-16 04:31:01.142213269 +0000 UTC m=+409.044989261" watchObservedRunningTime="2026-04-16 04:31:01.14361947 +0000 UTC m=+409.046395468" Apr 16 04:31:01.358083 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:01.358059 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-t42pr" Apr 16 04:31:01.515578 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:01.515545 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlzj7\" (UniqueName: \"kubernetes.io/projected/5d49287c-93c6-4ba0-bc3e-d2c822a219d7-kube-api-access-xlzj7\") pod \"5d49287c-93c6-4ba0-bc3e-d2c822a219d7\" (UID: \"5d49287c-93c6-4ba0-bc3e-d2c822a219d7\") " Apr 16 04:31:01.517730 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:01.517694 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d49287c-93c6-4ba0-bc3e-d2c822a219d7-kube-api-access-xlzj7" (OuterVolumeSpecName: "kube-api-access-xlzj7") pod "5d49287c-93c6-4ba0-bc3e-d2c822a219d7" (UID: "5d49287c-93c6-4ba0-bc3e-d2c822a219d7"). InnerVolumeSpecName "kube-api-access-xlzj7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:31:01.616714 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:01.616668 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xlzj7\" (UniqueName: \"kubernetes.io/projected/5d49287c-93c6-4ba0-bc3e-d2c822a219d7-kube-api-access-xlzj7\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:31:02.119616 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:02.119584 2575 generic.go:358] "Generic (PLEG): container finished" podID="5d49287c-93c6-4ba0-bc3e-d2c822a219d7" containerID="b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8" exitCode=0 Apr 16 04:31:02.120086 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:02.119646 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-t42pr" Apr 16 04:31:02.120086 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:02.119670 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-t42pr" event={"ID":"5d49287c-93c6-4ba0-bc3e-d2c822a219d7","Type":"ContainerDied","Data":"b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8"} Apr 16 04:31:02.120086 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:02.119713 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-t42pr" event={"ID":"5d49287c-93c6-4ba0-bc3e-d2c822a219d7","Type":"ContainerDied","Data":"6b1079ea86cc2d5772fae4474012d72fdd7284f48584578097f4f3a8727df17a"} Apr 16 04:31:02.120086 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:02.119731 2575 scope.go:117] "RemoveContainer" containerID="b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8" Apr 16 04:31:02.129610 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:02.129593 2575 scope.go:117] "RemoveContainer" containerID="b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8" Apr 16 04:31:02.129912 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:31:02.129889 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8\": container with ID starting with b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8 not found: ID does not exist" containerID="b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8" Apr 16 04:31:02.129985 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:02.129920 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8"} err="failed to get container status \"b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8\": rpc 
error: code = NotFound desc = could not find container \"b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8\": container with ID starting with b26400fe767522daf0163df85f35fca8377ea886fcdb09c12a1ad5eab67473a8 not found: ID does not exist" Apr 16 04:31:02.142730 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:02.142673 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t42pr"] Apr 16 04:31:02.146067 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:02.146023 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t42pr"] Apr 16 04:31:02.708019 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:02.707987 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d49287c-93c6-4ba0-bc3e-d2c822a219d7" path="/var/lib/kubelet/pods/5d49287c-93c6-4ba0-bc3e-d2c822a219d7/volumes" Apr 16 04:31:08.465777 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:08.465738 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-lllgk" Apr 16 04:31:08.465777 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:08.465788 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-lllgk" Apr 16 04:31:08.488244 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:08.488216 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-lllgk" Apr 16 04:31:09.164814 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:09.164783 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-lllgk" Apr 16 04:31:13.182074 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.182027 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn"] Apr 16 04:31:13.182491 
ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.182456 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d49287c-93c6-4ba0-bc3e-d2c822a219d7" containerName="registry-server" Apr 16 04:31:13.182491 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.182469 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d49287c-93c6-4ba0-bc3e-d2c822a219d7" containerName="registry-server" Apr 16 04:31:13.182562 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.182529 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d49287c-93c6-4ba0-bc3e-d2c822a219d7" containerName="registry-server" Apr 16 04:31:13.193392 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.193361 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" Apr 16 04:31:13.194763 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.194737 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn"] Apr 16 04:31:13.196147 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.196123 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-5kqt9\"" Apr 16 04:31:13.213608 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.213576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvxhs\" (UniqueName: \"kubernetes.io/projected/5c4034f6-34d8-4652-b48a-c39d4dca98db-kube-api-access-jvxhs\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn\" (UID: \"5c4034f6-34d8-4652-b48a-c39d4dca98db\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" Apr 16 04:31:13.213766 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.213690 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4034f6-34d8-4652-b48a-c39d4dca98db-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn\" (UID: \"5c4034f6-34d8-4652-b48a-c39d4dca98db\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" Apr 16 04:31:13.213766 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.213716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4034f6-34d8-4652-b48a-c39d4dca98db-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn\" (UID: \"5c4034f6-34d8-4652-b48a-c39d4dca98db\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" Apr 16 04:31:13.314378 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.314307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4034f6-34d8-4652-b48a-c39d4dca98db-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn\" (UID: \"5c4034f6-34d8-4652-b48a-c39d4dca98db\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" Apr 16 04:31:13.314545 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.314389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4034f6-34d8-4652-b48a-c39d4dca98db-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn\" (UID: \"5c4034f6-34d8-4652-b48a-c39d4dca98db\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" Apr 16 04:31:13.314545 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.314412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvxhs\" (UniqueName: 
\"kubernetes.io/projected/5c4034f6-34d8-4652-b48a-c39d4dca98db-kube-api-access-jvxhs\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn\" (UID: \"5c4034f6-34d8-4652-b48a-c39d4dca98db\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" Apr 16 04:31:13.314704 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.314686 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4034f6-34d8-4652-b48a-c39d4dca98db-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn\" (UID: \"5c4034f6-34d8-4652-b48a-c39d4dca98db\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" Apr 16 04:31:13.314790 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.314767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4034f6-34d8-4652-b48a-c39d4dca98db-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn\" (UID: \"5c4034f6-34d8-4652-b48a-c39d4dca98db\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" Apr 16 04:31:13.322909 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.322874 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvxhs\" (UniqueName: \"kubernetes.io/projected/5c4034f6-34d8-4652-b48a-c39d4dca98db-kube-api-access-jvxhs\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn\" (UID: \"5c4034f6-34d8-4652-b48a-c39d4dca98db\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" Apr 16 04:31:13.504401 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.504367 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" Apr 16 04:31:13.629606 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.629580 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn"] Apr 16 04:31:13.631190 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:31:13.631167 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c4034f6_34d8_4652_b48a_c39d4dca98db.slice/crio-e927616d4898f11cb3a7785d120129cec17006e494d3847590874a5bcfc09325 WatchSource:0}: Error finding container e927616d4898f11cb3a7785d120129cec17006e494d3847590874a5bcfc09325: Status 404 returned error can't find the container with id e927616d4898f11cb3a7785d120129cec17006e494d3847590874a5bcfc09325 Apr 16 04:31:13.781936 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.781905 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4"] Apr 16 04:31:13.785601 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.785582 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" Apr 16 04:31:13.792609 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.792576 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4"] Apr 16 04:31:13.818386 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.818361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0883035b-5114-4548-a7e0-5145e03e2f01-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4\" (UID: \"0883035b-5114-4548-a7e0-5145e03e2f01\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" Apr 16 04:31:13.818531 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.818393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cbzz\" (UniqueName: \"kubernetes.io/projected/0883035b-5114-4548-a7e0-5145e03e2f01-kube-api-access-2cbzz\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4\" (UID: \"0883035b-5114-4548-a7e0-5145e03e2f01\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" Apr 16 04:31:13.818531 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.818429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0883035b-5114-4548-a7e0-5145e03e2f01-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4\" (UID: \"0883035b-5114-4548-a7e0-5145e03e2f01\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" Apr 16 04:31:13.918989 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.918950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/0883035b-5114-4548-a7e0-5145e03e2f01-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4\" (UID: \"0883035b-5114-4548-a7e0-5145e03e2f01\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" Apr 16 04:31:13.919200 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.919069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0883035b-5114-4548-a7e0-5145e03e2f01-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4\" (UID: \"0883035b-5114-4548-a7e0-5145e03e2f01\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" Apr 16 04:31:13.919200 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.919100 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cbzz\" (UniqueName: \"kubernetes.io/projected/0883035b-5114-4548-a7e0-5145e03e2f01-kube-api-access-2cbzz\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4\" (UID: \"0883035b-5114-4548-a7e0-5145e03e2f01\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" Apr 16 04:31:13.919402 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.919379 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0883035b-5114-4548-a7e0-5145e03e2f01-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4\" (UID: \"0883035b-5114-4548-a7e0-5145e03e2f01\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" Apr 16 04:31:13.919477 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.919399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0883035b-5114-4548-a7e0-5145e03e2f01-bundle\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4\" (UID: \"0883035b-5114-4548-a7e0-5145e03e2f01\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" Apr 16 04:31:13.926954 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:13.926918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cbzz\" (UniqueName: \"kubernetes.io/projected/0883035b-5114-4548-a7e0-5145e03e2f01-kube-api-access-2cbzz\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4\" (UID: \"0883035b-5114-4548-a7e0-5145e03e2f01\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" Apr 16 04:31:14.117988 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.117881 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" Apr 16 04:31:14.162848 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.162803 2575 generic.go:358] "Generic (PLEG): container finished" podID="5c4034f6-34d8-4652-b48a-c39d4dca98db" containerID="42b6fd35b2942242b08e76062be559450d7b6f149e6bd6056fc3600fdbb69834" exitCode=0 Apr 16 04:31:14.162848 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.162852 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" event={"ID":"5c4034f6-34d8-4652-b48a-c39d4dca98db","Type":"ContainerDied","Data":"42b6fd35b2942242b08e76062be559450d7b6f149e6bd6056fc3600fdbb69834"} Apr 16 04:31:14.163087 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.162880 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" event={"ID":"5c4034f6-34d8-4652-b48a-c39d4dca98db","Type":"ContainerStarted","Data":"e927616d4898f11cb3a7785d120129cec17006e494d3847590874a5bcfc09325"} Apr 16 04:31:14.245665 ip-10-0-133-103 
kubenswrapper[2575]: I0416 04:31:14.245614 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4"] Apr 16 04:31:14.247839 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:31:14.247811 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0883035b_5114_4548_a7e0_5145e03e2f01.slice/crio-9502cf7e25190eb3b2f9de21e642b765d6a310b57474f7ad72b1579bdbbcf64a WatchSource:0}: Error finding container 9502cf7e25190eb3b2f9de21e642b765d6a310b57474f7ad72b1579bdbbcf64a: Status 404 returned error can't find the container with id 9502cf7e25190eb3b2f9de21e642b765d6a310b57474f7ad72b1579bdbbcf64a Apr 16 04:31:14.380698 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.380615 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"] Apr 16 04:31:14.384294 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.384275 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:14.390744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.390709 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"]
Apr 16 04:31:14.424484 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.424440 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29185a8a-aac1-4386-8cb3-10e29ac2369e-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff\" (UID: \"29185a8a-aac1-4386-8cb3-10e29ac2369e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:14.424634 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.424501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29185a8a-aac1-4386-8cb3-10e29ac2369e-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff\" (UID: \"29185a8a-aac1-4386-8cb3-10e29ac2369e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:14.424634 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.424521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8vjd\" (UniqueName: \"kubernetes.io/projected/29185a8a-aac1-4386-8cb3-10e29ac2369e-kube-api-access-p8vjd\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff\" (UID: \"29185a8a-aac1-4386-8cb3-10e29ac2369e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:14.525006 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.524966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29185a8a-aac1-4386-8cb3-10e29ac2369e-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff\" (UID: \"29185a8a-aac1-4386-8cb3-10e29ac2369e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:14.525182 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.525019 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29185a8a-aac1-4386-8cb3-10e29ac2369e-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff\" (UID: \"29185a8a-aac1-4386-8cb3-10e29ac2369e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:14.525182 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.525038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8vjd\" (UniqueName: \"kubernetes.io/projected/29185a8a-aac1-4386-8cb3-10e29ac2369e-kube-api-access-p8vjd\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff\" (UID: \"29185a8a-aac1-4386-8cb3-10e29ac2369e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:14.525409 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.525390 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29185a8a-aac1-4386-8cb3-10e29ac2369e-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff\" (UID: \"29185a8a-aac1-4386-8cb3-10e29ac2369e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:14.525458 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.525426 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29185a8a-aac1-4386-8cb3-10e29ac2369e-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff\" (UID: \"29185a8a-aac1-4386-8cb3-10e29ac2369e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:14.532859 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.532834 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8vjd\" (UniqueName: \"kubernetes.io/projected/29185a8a-aac1-4386-8cb3-10e29ac2369e-kube-api-access-p8vjd\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff\" (UID: \"29185a8a-aac1-4386-8cb3-10e29ac2369e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:14.730493 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.730419 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:14.784594 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.784560 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"]
Apr 16 04:31:14.790298 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.790275 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:14.794878 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.794778 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"]
Apr 16 04:31:14.827280 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.827249 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9\" (UID: \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:14.827280 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.827284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2th95\" (UniqueName: \"kubernetes.io/projected/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-kube-api-access-2th95\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9\" (UID: \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:14.827551 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.827425 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9\" (UID: \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:14.872233 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.872206 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"]
Apr 16 04:31:14.900375 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:31:14.900316 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29185a8a_aac1_4386_8cb3_10e29ac2369e.slice/crio-effa483e8d771301be9cf84c79c7673eef2ad2e5628a536ebc61ecb37d858e15 WatchSource:0}: Error finding container effa483e8d771301be9cf84c79c7673eef2ad2e5628a536ebc61ecb37d858e15: Status 404 returned error can't find the container with id effa483e8d771301be9cf84c79c7673eef2ad2e5628a536ebc61ecb37d858e15
Apr 16 04:31:14.928068 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.928026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9\" (UID: \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:14.928196 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.928073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2th95\" (UniqueName: \"kubernetes.io/projected/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-kube-api-access-2th95\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9\" (UID: \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:14.928196 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.928113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9\" (UID: \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:14.928491 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.928465 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9\" (UID: \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:14.928537 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.928474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9\" (UID: \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:14.942406 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:14.936995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2th95\" (UniqueName: \"kubernetes.io/projected/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-kube-api-access-2th95\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9\" (UID: \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:15.108430 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:15.108387 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:15.178976 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:15.177618 2575 generic.go:358] "Generic (PLEG): container finished" podID="29185a8a-aac1-4386-8cb3-10e29ac2369e" containerID="a8054b1a06f71062de810357f5fe07a0adff1166dff8114eeaf5d94c4589e207" exitCode=0
Apr 16 04:31:15.178976 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:15.177697 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff" event={"ID":"29185a8a-aac1-4386-8cb3-10e29ac2369e","Type":"ContainerDied","Data":"a8054b1a06f71062de810357f5fe07a0adff1166dff8114eeaf5d94c4589e207"}
Apr 16 04:31:15.178976 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:15.177726 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff" event={"ID":"29185a8a-aac1-4386-8cb3-10e29ac2369e","Type":"ContainerStarted","Data":"effa483e8d771301be9cf84c79c7673eef2ad2e5628a536ebc61ecb37d858e15"}
Apr 16 04:31:15.190784 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:15.190743 2575 generic.go:358] "Generic (PLEG): container finished" podID="0883035b-5114-4548-a7e0-5145e03e2f01" containerID="95762edd1c68ad1462762ec92b854d92803580f70d120d6dbadee5645cf20391" exitCode=0
Apr 16 04:31:15.190974 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:15.190789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" event={"ID":"0883035b-5114-4548-a7e0-5145e03e2f01","Type":"ContainerDied","Data":"95762edd1c68ad1462762ec92b854d92803580f70d120d6dbadee5645cf20391"}
Apr 16 04:31:15.190974 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:15.190844 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" event={"ID":"0883035b-5114-4548-a7e0-5145e03e2f01","Type":"ContainerStarted","Data":"9502cf7e25190eb3b2f9de21e642b765d6a310b57474f7ad72b1579bdbbcf64a"}
Apr 16 04:31:15.193409 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:15.193284 2575 generic.go:358] "Generic (PLEG): container finished" podID="5c4034f6-34d8-4652-b48a-c39d4dca98db" containerID="bdd414e25be123e34bd6410978f42002545368b66b5af66ccd8a02bdeaba2c4c" exitCode=0
Apr 16 04:31:15.193409 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:15.193370 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" event={"ID":"5c4034f6-34d8-4652-b48a-c39d4dca98db","Type":"ContainerDied","Data":"bdd414e25be123e34bd6410978f42002545368b66b5af66ccd8a02bdeaba2c4c"}
Apr 16 04:31:15.254465 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:15.254442 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"]
Apr 16 04:31:15.257157 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:31:15.257099 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podace60148_af6e_4ac7_aa73_cc2276b4f3c7.slice/crio-727239ef65392099ddc3f168c42f4bdacb8e0debd42f2ad61910021e764f9d34 WatchSource:0}: Error finding container 727239ef65392099ddc3f168c42f4bdacb8e0debd42f2ad61910021e764f9d34: Status 404 returned error can't find the container with id 727239ef65392099ddc3f168c42f4bdacb8e0debd42f2ad61910021e764f9d34
Apr 16 04:31:16.201130 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:16.201043 2575 generic.go:358] "Generic (PLEG): container finished" podID="29185a8a-aac1-4386-8cb3-10e29ac2369e" containerID="b338d9d60487ab8eb973e8dd0978a283bd48b69dc7ab75ae70804c0103032296" exitCode=0
Apr 16 04:31:16.201130 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:16.201106 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff" event={"ID":"29185a8a-aac1-4386-8cb3-10e29ac2369e","Type":"ContainerDied","Data":"b338d9d60487ab8eb973e8dd0978a283bd48b69dc7ab75ae70804c0103032296"}
Apr 16 04:31:16.202670 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:16.202635 2575 generic.go:358] "Generic (PLEG): container finished" podID="0883035b-5114-4548-a7e0-5145e03e2f01" containerID="f950fc11e2c15763f7a8aec5c0bfb169303e42e49c26388fb424ca8f11318e4b" exitCode=0
Apr 16 04:31:16.202790 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:16.202689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" event={"ID":"0883035b-5114-4548-a7e0-5145e03e2f01","Type":"ContainerDied","Data":"f950fc11e2c15763f7a8aec5c0bfb169303e42e49c26388fb424ca8f11318e4b"}
Apr 16 04:31:16.204126 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:16.204100 2575 generic.go:358] "Generic (PLEG): container finished" podID="ace60148-af6e-4ac7-aa73-cc2276b4f3c7" containerID="58f57d49c133e626e60fa56929212c96b5edbe9728a4bae9687425f15108cd25" exitCode=0
Apr 16 04:31:16.204248 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:16.204176 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9" event={"ID":"ace60148-af6e-4ac7-aa73-cc2276b4f3c7","Type":"ContainerDied","Data":"58f57d49c133e626e60fa56929212c96b5edbe9728a4bae9687425f15108cd25"}
Apr 16 04:31:16.204248 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:16.204197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9" event={"ID":"ace60148-af6e-4ac7-aa73-cc2276b4f3c7","Type":"ContainerStarted","Data":"727239ef65392099ddc3f168c42f4bdacb8e0debd42f2ad61910021e764f9d34"}
Apr 16 04:31:16.206152 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:16.206125 2575 generic.go:358] "Generic (PLEG): container finished" podID="5c4034f6-34d8-4652-b48a-c39d4dca98db" containerID="e0cfa7c01d5044539d1f53f98cfa43036bf98133feacb062f77bb2e05bfe7436" exitCode=0
Apr 16 04:31:16.206376 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:16.206195 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" event={"ID":"5c4034f6-34d8-4652-b48a-c39d4dca98db","Type":"ContainerDied","Data":"e0cfa7c01d5044539d1f53f98cfa43036bf98133feacb062f77bb2e05bfe7436"}
Apr 16 04:31:17.212163 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.212116 2575 generic.go:358] "Generic (PLEG): container finished" podID="29185a8a-aac1-4386-8cb3-10e29ac2369e" containerID="84c6ebe03de0f59a4391f206b8c48ae7b2ff7d998278d232b0f68ab8f218bd4b" exitCode=0
Apr 16 04:31:17.212651 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.212163 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff" event={"ID":"29185a8a-aac1-4386-8cb3-10e29ac2369e","Type":"ContainerDied","Data":"84c6ebe03de0f59a4391f206b8c48ae7b2ff7d998278d232b0f68ab8f218bd4b"}
Apr 16 04:31:17.214038 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.214014 2575 generic.go:358] "Generic (PLEG): container finished" podID="0883035b-5114-4548-a7e0-5145e03e2f01" containerID="dfabcb49e0de3a276cb9e5eaeb87b896db74502b3d1484a59fbb00f0d32d06d8" exitCode=0
Apr 16 04:31:17.214114 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.214076 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" event={"ID":"0883035b-5114-4548-a7e0-5145e03e2f01","Type":"ContainerDied","Data":"dfabcb49e0de3a276cb9e5eaeb87b896db74502b3d1484a59fbb00f0d32d06d8"}
Apr 16 04:31:17.215538 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.215507 2575 generic.go:358] "Generic (PLEG): container finished" podID="ace60148-af6e-4ac7-aa73-cc2276b4f3c7" containerID="41544f3711c77cf697aa266b27b0bda262e3d76ea8a97256bfeb3a16bea925e5" exitCode=0
Apr 16 04:31:17.215615 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.215543 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9" event={"ID":"ace60148-af6e-4ac7-aa73-cc2276b4f3c7","Type":"ContainerDied","Data":"41544f3711c77cf697aa266b27b0bda262e3d76ea8a97256bfeb3a16bea925e5"}
Apr 16 04:31:17.339838 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.339814 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn"
Apr 16 04:31:17.452503 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.452463 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4034f6-34d8-4652-b48a-c39d4dca98db-bundle\") pod \"5c4034f6-34d8-4652-b48a-c39d4dca98db\" (UID: \"5c4034f6-34d8-4652-b48a-c39d4dca98db\") "
Apr 16 04:31:17.452657 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.452582 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvxhs\" (UniqueName: \"kubernetes.io/projected/5c4034f6-34d8-4652-b48a-c39d4dca98db-kube-api-access-jvxhs\") pod \"5c4034f6-34d8-4652-b48a-c39d4dca98db\" (UID: \"5c4034f6-34d8-4652-b48a-c39d4dca98db\") "
Apr 16 04:31:17.452700 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.452655 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4034f6-34d8-4652-b48a-c39d4dca98db-util\") pod \"5c4034f6-34d8-4652-b48a-c39d4dca98db\" (UID: \"5c4034f6-34d8-4652-b48a-c39d4dca98db\") "
Apr 16 04:31:17.453066 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.453042 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c4034f6-34d8-4652-b48a-c39d4dca98db-bundle" (OuterVolumeSpecName: "bundle") pod "5c4034f6-34d8-4652-b48a-c39d4dca98db" (UID: "5c4034f6-34d8-4652-b48a-c39d4dca98db"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:31:17.454807 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.454779 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4034f6-34d8-4652-b48a-c39d4dca98db-kube-api-access-jvxhs" (OuterVolumeSpecName: "kube-api-access-jvxhs") pod "5c4034f6-34d8-4652-b48a-c39d4dca98db" (UID: "5c4034f6-34d8-4652-b48a-c39d4dca98db"). InnerVolumeSpecName "kube-api-access-jvxhs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:31:17.457934 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.457899 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c4034f6-34d8-4652-b48a-c39d4dca98db-util" (OuterVolumeSpecName: "util") pod "5c4034f6-34d8-4652-b48a-c39d4dca98db" (UID: "5c4034f6-34d8-4652-b48a-c39d4dca98db"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:31:17.554206 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.554101 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jvxhs\" (UniqueName: \"kubernetes.io/projected/5c4034f6-34d8-4652-b48a-c39d4dca98db-kube-api-access-jvxhs\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:17.554206 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.554150 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4034f6-34d8-4652-b48a-c39d4dca98db-util\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:17.554206 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:17.554161 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4034f6-34d8-4652-b48a-c39d4dca98db-bundle\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:18.221428 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.221388 2575 generic.go:358] "Generic (PLEG): container finished" podID="ace60148-af6e-4ac7-aa73-cc2276b4f3c7" containerID="6171e3ab0ec28bafadbc07e182ac876978a1b5d239fb0091b72a8d710c26bfe8" exitCode=0
Apr 16 04:31:18.221845 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.221469 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9" event={"ID":"ace60148-af6e-4ac7-aa73-cc2276b4f3c7","Type":"ContainerDied","Data":"6171e3ab0ec28bafadbc07e182ac876978a1b5d239fb0091b72a8d710c26bfe8"}
Apr 16 04:31:18.223023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.223002 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn"
Apr 16 04:31:18.223113 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.223021 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn" event={"ID":"5c4034f6-34d8-4652-b48a-c39d4dca98db","Type":"ContainerDied","Data":"e927616d4898f11cb3a7785d120129cec17006e494d3847590874a5bcfc09325"}
Apr 16 04:31:18.223113 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.223054 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e927616d4898f11cb3a7785d120129cec17006e494d3847590874a5bcfc09325"
Apr 16 04:31:18.371251 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.371227 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4"
Apr 16 04:31:18.375231 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.375211 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:18.462610 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.462571 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29185a8a-aac1-4386-8cb3-10e29ac2369e-util\") pod \"29185a8a-aac1-4386-8cb3-10e29ac2369e\" (UID: \"29185a8a-aac1-4386-8cb3-10e29ac2369e\") "
Apr 16 04:31:18.462610 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.462610 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0883035b-5114-4548-a7e0-5145e03e2f01-bundle\") pod \"0883035b-5114-4548-a7e0-5145e03e2f01\" (UID: \"0883035b-5114-4548-a7e0-5145e03e2f01\") "
Apr 16 04:31:18.462865 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.462632 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0883035b-5114-4548-a7e0-5145e03e2f01-util\") pod \"0883035b-5114-4548-a7e0-5145e03e2f01\" (UID: \"0883035b-5114-4548-a7e0-5145e03e2f01\") "
Apr 16 04:31:18.462865 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.462666 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cbzz\" (UniqueName: \"kubernetes.io/projected/0883035b-5114-4548-a7e0-5145e03e2f01-kube-api-access-2cbzz\") pod \"0883035b-5114-4548-a7e0-5145e03e2f01\" (UID: \"0883035b-5114-4548-a7e0-5145e03e2f01\") "
Apr 16 04:31:18.462865 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.462695 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29185a8a-aac1-4386-8cb3-10e29ac2369e-bundle\") pod \"29185a8a-aac1-4386-8cb3-10e29ac2369e\" (UID: \"29185a8a-aac1-4386-8cb3-10e29ac2369e\") "
Apr 16 04:31:18.462865 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.462727 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8vjd\" (UniqueName: \"kubernetes.io/projected/29185a8a-aac1-4386-8cb3-10e29ac2369e-kube-api-access-p8vjd\") pod \"29185a8a-aac1-4386-8cb3-10e29ac2369e\" (UID: \"29185a8a-aac1-4386-8cb3-10e29ac2369e\") "
Apr 16 04:31:18.463315 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.463279 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29185a8a-aac1-4386-8cb3-10e29ac2369e-bundle" (OuterVolumeSpecName: "bundle") pod "29185a8a-aac1-4386-8cb3-10e29ac2369e" (UID: "29185a8a-aac1-4386-8cb3-10e29ac2369e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:31:18.463462 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.463419 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0883035b-5114-4548-a7e0-5145e03e2f01-bundle" (OuterVolumeSpecName: "bundle") pod "0883035b-5114-4548-a7e0-5145e03e2f01" (UID: "0883035b-5114-4548-a7e0-5145e03e2f01"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:31:18.464942 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.464910 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0883035b-5114-4548-a7e0-5145e03e2f01-kube-api-access-2cbzz" (OuterVolumeSpecName: "kube-api-access-2cbzz") pod "0883035b-5114-4548-a7e0-5145e03e2f01" (UID: "0883035b-5114-4548-a7e0-5145e03e2f01"). InnerVolumeSpecName "kube-api-access-2cbzz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:31:18.465051 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.465024 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29185a8a-aac1-4386-8cb3-10e29ac2369e-kube-api-access-p8vjd" (OuterVolumeSpecName: "kube-api-access-p8vjd") pod "29185a8a-aac1-4386-8cb3-10e29ac2369e" (UID: "29185a8a-aac1-4386-8cb3-10e29ac2369e"). InnerVolumeSpecName "kube-api-access-p8vjd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:31:18.468529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.468481 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0883035b-5114-4548-a7e0-5145e03e2f01-util" (OuterVolumeSpecName: "util") pod "0883035b-5114-4548-a7e0-5145e03e2f01" (UID: "0883035b-5114-4548-a7e0-5145e03e2f01"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:31:18.468654 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.468615 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29185a8a-aac1-4386-8cb3-10e29ac2369e-util" (OuterVolumeSpecName: "util") pod "29185a8a-aac1-4386-8cb3-10e29ac2369e" (UID: "29185a8a-aac1-4386-8cb3-10e29ac2369e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:31:18.563999 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.563967 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29185a8a-aac1-4386-8cb3-10e29ac2369e-util\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:18.563999 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.563999 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0883035b-5114-4548-a7e0-5145e03e2f01-bundle\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:18.564198 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.564008 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0883035b-5114-4548-a7e0-5145e03e2f01-util\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:18.564198 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.564017 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2cbzz\" (UniqueName: \"kubernetes.io/projected/0883035b-5114-4548-a7e0-5145e03e2f01-kube-api-access-2cbzz\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:18.564198 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.564026 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29185a8a-aac1-4386-8cb3-10e29ac2369e-bundle\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:18.564198 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:18.564036 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p8vjd\" (UniqueName: \"kubernetes.io/projected/29185a8a-aac1-4386-8cb3-10e29ac2369e-kube-api-access-p8vjd\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:19.228285 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.228250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff" event={"ID":"29185a8a-aac1-4386-8cb3-10e29ac2369e","Type":"ContainerDied","Data":"effa483e8d771301be9cf84c79c7673eef2ad2e5628a536ebc61ecb37d858e15"}
Apr 16 04:31:19.228285 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.228278 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff"
Apr 16 04:31:19.228756 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.228286 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="effa483e8d771301be9cf84c79c7673eef2ad2e5628a536ebc61ecb37d858e15"
Apr 16 04:31:19.230107 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.230085 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4"
Apr 16 04:31:19.230261 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.230087 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4" event={"ID":"0883035b-5114-4548-a7e0-5145e03e2f01","Type":"ContainerDied","Data":"9502cf7e25190eb3b2f9de21e642b765d6a310b57474f7ad72b1579bdbbcf64a"}
Apr 16 04:31:19.230261 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.230196 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9502cf7e25190eb3b2f9de21e642b765d6a310b57474f7ad72b1579bdbbcf64a"
Apr 16 04:31:19.359573 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.359546 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:19.472494 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.472462 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-bundle\") pod \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\" (UID: \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\") "
Apr 16 04:31:19.472690 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.472539 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-util\") pod \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\" (UID: \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\") "
Apr 16 04:31:19.472690 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.472581 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2th95\" (UniqueName: \"kubernetes.io/projected/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-kube-api-access-2th95\") pod \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\" (UID: \"ace60148-af6e-4ac7-aa73-cc2276b4f3c7\") "
Apr 16 04:31:19.472940 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.472906 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-bundle" (OuterVolumeSpecName: "bundle") pod "ace60148-af6e-4ac7-aa73-cc2276b4f3c7" (UID: "ace60148-af6e-4ac7-aa73-cc2276b4f3c7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:31:19.474783 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.474754 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-kube-api-access-2th95" (OuterVolumeSpecName: "kube-api-access-2th95") pod "ace60148-af6e-4ac7-aa73-cc2276b4f3c7" (UID: "ace60148-af6e-4ac7-aa73-cc2276b4f3c7"). InnerVolumeSpecName "kube-api-access-2th95". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:31:19.477107 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.477085 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-util" (OuterVolumeSpecName: "util") pod "ace60148-af6e-4ac7-aa73-cc2276b4f3c7" (UID: "ace60148-af6e-4ac7-aa73-cc2276b4f3c7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:31:19.573677 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.573640 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-bundle\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:19.573677 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.573669 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-util\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:19.573677 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:19.573678 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2th95\" (UniqueName: \"kubernetes.io/projected/ace60148-af6e-4ac7-aa73-cc2276b4f3c7-kube-api-access-2th95\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:20.235124 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:20.235091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9" event={"ID":"ace60148-af6e-4ac7-aa73-cc2276b4f3c7","Type":"ContainerDied","Data":"727239ef65392099ddc3f168c42f4bdacb8e0debd42f2ad61910021e764f9d34"}
Apr 16 04:31:20.235124 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:20.235131 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="727239ef65392099ddc3f168c42f4bdacb8e0debd42f2ad61910021e764f9d34"
Apr 16 04:31:20.235655 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:20.235108 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9"
Apr 16 04:31:28.768403 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768312 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc"]
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768666 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c4034f6-34d8-4652-b48a-c39d4dca98db" containerName="util"
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768678 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4034f6-34d8-4652-b48a-c39d4dca98db" containerName="util"
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768688 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0883035b-5114-4548-a7e0-5145e03e2f01" containerName="util"
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768694 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0883035b-5114-4548-a7e0-5145e03e2f01" containerName="util"
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768700 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29185a8a-aac1-4386-8cb3-10e29ac2369e" containerName="pull"
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768706 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="29185a8a-aac1-4386-8cb3-10e29ac2369e" containerName="pull"
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768713 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c4034f6-34d8-4652-b48a-c39d4dca98db" containerName="extract"
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768720 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4034f6-34d8-4652-b48a-c39d4dca98db" containerName="extract"
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768728 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29185a8a-aac1-4386-8cb3-10e29ac2369e" containerName="util"
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768733 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="29185a8a-aac1-4386-8cb3-10e29ac2369e" containerName="util"
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768741 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ace60148-af6e-4ac7-aa73-cc2276b4f3c7" containerName="extract"
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768746 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace60148-af6e-4ac7-aa73-cc2276b4f3c7" containerName="extract"
Apr 16 04:31:28.768750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768754 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29185a8a-aac1-4386-8cb3-10e29ac2369e" containerName="extract"
Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768759 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="29185a8a-aac1-4386-8cb3-10e29ac2369e" containerName="extract"
Apr 16
04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768768 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0883035b-5114-4548-a7e0-5145e03e2f01" containerName="pull" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768773 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0883035b-5114-4548-a7e0-5145e03e2f01" containerName="pull" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768779 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ace60148-af6e-4ac7-aa73-cc2276b4f3c7" containerName="pull" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768784 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace60148-af6e-4ac7-aa73-cc2276b4f3c7" containerName="pull" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768797 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0883035b-5114-4548-a7e0-5145e03e2f01" containerName="extract" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768802 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0883035b-5114-4548-a7e0-5145e03e2f01" containerName="extract" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768808 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ace60148-af6e-4ac7-aa73-cc2276b4f3c7" containerName="util" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768812 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace60148-af6e-4ac7-aa73-cc2276b4f3c7" containerName="util" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768818 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c4034f6-34d8-4652-b48a-c39d4dca98db" containerName="pull" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 
04:31:28.768823 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4034f6-34d8-4652-b48a-c39d4dca98db" containerName="pull" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768881 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0883035b-5114-4548-a7e0-5145e03e2f01" containerName="extract" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768896 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="29185a8a-aac1-4386-8cb3-10e29ac2369e" containerName="extract" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768907 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c4034f6-34d8-4652-b48a-c39d4dca98db" containerName="extract" Apr 16 04:31:28.769142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.768913 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ace60148-af6e-4ac7-aa73-cc2276b4f3c7" containerName="extract" Apr 16 04:31:28.771837 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.771817 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" Apr 16 04:31:28.774584 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.774562 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-g2fxf\"" Apr 16 04:31:28.784356 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.784317 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc"] Apr 16 04:31:28.847772 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.847726 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpqbc\" (UniqueName: \"kubernetes.io/projected/bdf7259c-f67a-4238-96eb-da5d561be217-kube-api-access-tpqbc\") pod \"limitador-operator-controller-manager-85c4996f8c-cqlkc\" (UID: \"bdf7259c-f67a-4238-96eb-da5d561be217\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" Apr 16 04:31:28.948787 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.948745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpqbc\" (UniqueName: \"kubernetes.io/projected/bdf7259c-f67a-4238-96eb-da5d561be217-kube-api-access-tpqbc\") pod \"limitador-operator-controller-manager-85c4996f8c-cqlkc\" (UID: \"bdf7259c-f67a-4238-96eb-da5d561be217\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" Apr 16 04:31:28.957669 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:28.957640 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpqbc\" (UniqueName: \"kubernetes.io/projected/bdf7259c-f67a-4238-96eb-da5d561be217-kube-api-access-tpqbc\") pod \"limitador-operator-controller-manager-85c4996f8c-cqlkc\" (UID: \"bdf7259c-f67a-4238-96eb-da5d561be217\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" Apr 16 04:31:29.082019 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:29.081911 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" Apr 16 04:31:29.211357 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:29.211318 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc"] Apr 16 04:31:29.213408 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:31:29.213380 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdf7259c_f67a_4238_96eb_da5d561be217.slice/crio-8ed849a6ce2a413ccb1d1848ac0b823175006af4754c1a0703f55fd9aaceceee WatchSource:0}: Error finding container 8ed849a6ce2a413ccb1d1848ac0b823175006af4754c1a0703f55fd9aaceceee: Status 404 returned error can't find the container with id 8ed849a6ce2a413ccb1d1848ac0b823175006af4754c1a0703f55fd9aaceceee Apr 16 04:31:29.269248 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:29.269213 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" event={"ID":"bdf7259c-f67a-4238-96eb-da5d561be217","Type":"ContainerStarted","Data":"8ed849a6ce2a413ccb1d1848ac0b823175006af4754c1a0703f55fd9aaceceee"} Apr 16 04:31:31.277213 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:31.277174 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" event={"ID":"bdf7259c-f67a-4238-96eb-da5d561be217","Type":"ContainerStarted","Data":"180090cbba2774c274dce6e9904788e91ca874195ab4416c0e0844e7bd8107a5"} Apr 16 04:31:31.277629 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:31.277293 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" Apr 16 04:31:31.294890 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:31.294827 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" podStartSLOduration=1.703426831 podStartE2EDuration="3.294807364s" podCreationTimestamp="2026-04-16 04:31:28 +0000 UTC" firstStartedPulling="2026-04-16 04:31:29.21519901 +0000 UTC m=+437.117974987" lastFinishedPulling="2026-04-16 04:31:30.806579543 +0000 UTC m=+438.709355520" observedRunningTime="2026-04-16 04:31:31.294308761 +0000 UTC m=+439.197084763" watchObservedRunningTime="2026-04-16 04:31:31.294807364 +0000 UTC m=+439.197583364" Apr 16 04:31:32.453077 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:32.453043 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s"] Apr 16 04:31:32.456475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:32.456458 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s" Apr 16 04:31:32.459218 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:32.459201 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 04:31:32.459343 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:32.459252 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-6sqqs\"" Apr 16 04:31:32.467089 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:32.467067 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s"] Apr 16 04:31:32.580120 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:32.580085 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s56mr\" (UniqueName: \"kubernetes.io/projected/3fdbb361-f9e6-49bc-ad0c-aade3124e9fc-kube-api-access-s56mr\") pod \"dns-operator-controller-manager-648d5c98bc-9cp6s\" (UID: \"3fdbb361-f9e6-49bc-ad0c-aade3124e9fc\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s" Apr 16 04:31:32.681128 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:32.681070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s56mr\" (UniqueName: \"kubernetes.io/projected/3fdbb361-f9e6-49bc-ad0c-aade3124e9fc-kube-api-access-s56mr\") pod \"dns-operator-controller-manager-648d5c98bc-9cp6s\" (UID: \"3fdbb361-f9e6-49bc-ad0c-aade3124e9fc\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s" Apr 16 04:31:32.689281 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:32.689257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s56mr\" (UniqueName: \"kubernetes.io/projected/3fdbb361-f9e6-49bc-ad0c-aade3124e9fc-kube-api-access-s56mr\") pod 
\"dns-operator-controller-manager-648d5c98bc-9cp6s\" (UID: \"3fdbb361-f9e6-49bc-ad0c-aade3124e9fc\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s" Apr 16 04:31:32.767215 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:32.767176 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s" Apr 16 04:31:32.899815 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:32.899784 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s"] Apr 16 04:31:32.902254 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:31:32.902216 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fdbb361_f9e6_49bc_ad0c_aade3124e9fc.slice/crio-77295ee6c925aff017418fffca596933f073bf88842c1312f6f4af41277471bd WatchSource:0}: Error finding container 77295ee6c925aff017418fffca596933f073bf88842c1312f6f4af41277471bd: Status 404 returned error can't find the container with id 77295ee6c925aff017418fffca596933f073bf88842c1312f6f4af41277471bd Apr 16 04:31:33.285236 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:33.285199 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s" event={"ID":"3fdbb361-f9e6-49bc-ad0c-aade3124e9fc","Type":"ContainerStarted","Data":"77295ee6c925aff017418fffca596933f073bf88842c1312f6f4af41277471bd"} Apr 16 04:31:36.298861 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:36.298828 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s" event={"ID":"3fdbb361-f9e6-49bc-ad0c-aade3124e9fc","Type":"ContainerStarted","Data":"a455bf0de302719518a10dae07b990d627716cfb4fe916189a950ae70bdfb3ca"} Apr 16 04:31:36.299242 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:36.298877 2575 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s" Apr 16 04:31:36.323658 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:36.323608 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s" podStartSLOduration=1.9478418149999999 podStartE2EDuration="4.323592592s" podCreationTimestamp="2026-04-16 04:31:32 +0000 UTC" firstStartedPulling="2026-04-16 04:31:32.904819406 +0000 UTC m=+440.807595388" lastFinishedPulling="2026-04-16 04:31:35.280570184 +0000 UTC m=+443.183346165" observedRunningTime="2026-04-16 04:31:36.321840434 +0000 UTC m=+444.224616446" watchObservedRunningTime="2026-04-16 04:31:36.323592592 +0000 UTC m=+444.226368591" Apr 16 04:31:42.283480 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:42.283449 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" Apr 16 04:31:44.826113 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:44.826074 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5998c4dcdc-rfsn4"] Apr 16 04:31:44.830388 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:44.830358 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:44.836820 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:44.836791 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5998c4dcdc-rfsn4"] Apr 16 04:31:44.985513 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:44.985475 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1ed06a6-893b-4e37-b06e-a2d444b721cb-console-oauth-config\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:44.985698 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:44.985539 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1ed06a6-893b-4e37-b06e-a2d444b721cb-trusted-ca-bundle\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:44.985698 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:44.985618 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1ed06a6-893b-4e37-b06e-a2d444b721cb-console-config\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:44.985773 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:44.985726 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1ed06a6-893b-4e37-b06e-a2d444b721cb-service-ca\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:44.985773 
ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:44.985756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ed06a6-893b-4e37-b06e-a2d444b721cb-console-serving-cert\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:44.985865 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:44.985782 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz5v9\" (UniqueName: \"kubernetes.io/projected/d1ed06a6-893b-4e37-b06e-a2d444b721cb-kube-api-access-nz5v9\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:44.985865 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:44.985828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1ed06a6-893b-4e37-b06e-a2d444b721cb-oauth-serving-cert\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.086973 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.086874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1ed06a6-893b-4e37-b06e-a2d444b721cb-oauth-serving-cert\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.086973 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.086925 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1ed06a6-893b-4e37-b06e-a2d444b721cb-console-oauth-config\") pod 
\"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.087198 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.087052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1ed06a6-893b-4e37-b06e-a2d444b721cb-trusted-ca-bundle\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.087198 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.087106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1ed06a6-893b-4e37-b06e-a2d444b721cb-console-config\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.087198 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.087158 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1ed06a6-893b-4e37-b06e-a2d444b721cb-service-ca\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.087198 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.087176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ed06a6-893b-4e37-b06e-a2d444b721cb-console-serving-cert\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.087198 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.087199 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nz5v9\" (UniqueName: 
\"kubernetes.io/projected/d1ed06a6-893b-4e37-b06e-a2d444b721cb-kube-api-access-nz5v9\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.087831 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.087808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1ed06a6-893b-4e37-b06e-a2d444b721cb-oauth-serving-cert\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.087912 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.087888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1ed06a6-893b-4e37-b06e-a2d444b721cb-console-config\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.088011 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.087990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1ed06a6-893b-4e37-b06e-a2d444b721cb-trusted-ca-bundle\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.088047 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.088014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1ed06a6-893b-4e37-b06e-a2d444b721cb-service-ca\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.089556 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.089535 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1ed06a6-893b-4e37-b06e-a2d444b721cb-console-oauth-config\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.089680 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.089660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ed06a6-893b-4e37-b06e-a2d444b721cb-console-serving-cert\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.095164 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.095140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz5v9\" (UniqueName: \"kubernetes.io/projected/d1ed06a6-893b-4e37-b06e-a2d444b721cb-kube-api-access-nz5v9\") pod \"console-5998c4dcdc-rfsn4\" (UID: \"d1ed06a6-893b-4e37-b06e-a2d444b721cb\") " pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.143413 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.143374 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:45.275155 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.275127 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5998c4dcdc-rfsn4"] Apr 16 04:31:45.276925 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:31:45.276899 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1ed06a6_893b_4e37_b06e_a2d444b721cb.slice/crio-4e08c503092ec17479f902f593a370dc097a306eb040daedc8dfc481e8269c5e WatchSource:0}: Error finding container 4e08c503092ec17479f902f593a370dc097a306eb040daedc8dfc481e8269c5e: Status 404 returned error can't find the container with id 4e08c503092ec17479f902f593a370dc097a306eb040daedc8dfc481e8269c5e Apr 16 04:31:45.335723 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:45.335689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5998c4dcdc-rfsn4" event={"ID":"d1ed06a6-893b-4e37-b06e-a2d444b721cb","Type":"ContainerStarted","Data":"4e08c503092ec17479f902f593a370dc097a306eb040daedc8dfc481e8269c5e"} Apr 16 04:31:46.340718 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:46.340683 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5998c4dcdc-rfsn4" event={"ID":"d1ed06a6-893b-4e37-b06e-a2d444b721cb","Type":"ContainerStarted","Data":"5a95967f88ead2726140618538b6e58c2cac6979f283fabf2f023a36bc9c579f"} Apr 16 04:31:46.358564 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:46.358503 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5998c4dcdc-rfsn4" podStartSLOduration=2.3584862429999998 podStartE2EDuration="2.358486243s" podCreationTimestamp="2026-04-16 04:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:31:46.356552338 +0000 UTC 
m=+454.259328339" watchObservedRunningTime="2026-04-16 04:31:46.358486243 +0000 UTC m=+454.261262241" Apr 16 04:31:47.305007 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:47.304979 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9cp6s" Apr 16 04:31:55.143748 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:55.143712 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:55.144141 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:55.143762 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:55.148620 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:55.148600 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:55.376378 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:55.376348 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5998c4dcdc-rfsn4" Apr 16 04:31:55.419896 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:55.419814 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-88b6ddcd5-x7jmn"] Apr 16 04:31:59.231939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.231903 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc"] Apr 16 04:31:59.232381 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.232141 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" podUID="bdf7259c-f67a-4238-96eb-da5d561be217" containerName="manager" containerID="cri-o://180090cbba2774c274dce6e9904788e91ca874195ab4416c0e0844e7bd8107a5" gracePeriod=2 Apr 16 04:31:59.241460 ip-10-0-133-103 
kubenswrapper[2575]: I0416 04:31:59.241433 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc"]
Apr 16 04:31:59.255257 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.255230 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz"]
Apr 16 04:31:59.255671 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.255656 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdf7259c-f67a-4238-96eb-da5d561be217" containerName="manager"
Apr 16 04:31:59.255723 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.255673 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf7259c-f67a-4238-96eb-da5d561be217" containerName="manager"
Apr 16 04:31:59.255757 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.255727 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdf7259c-f67a-4238-96eb-da5d561be217" containerName="manager"
Apr 16 04:31:59.259851 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.259829 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz"
Apr 16 04:31:59.263087 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.263054 2575 status_manager.go:895] "Failed to get status for pod" podUID="bdf7259c-f67a-4238-96eb-da5d561be217" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" err="pods \"limitador-operator-controller-manager-85c4996f8c-cqlkc\" is forbidden: User \"system:node:ip-10-0-133-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-103.ec2.internal' and this object"
Apr 16 04:31:59.269428 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.269403 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz"]
Apr 16 04:31:59.388587 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.388557 2575 generic.go:358] "Generic (PLEG): container finished" podID="bdf7259c-f67a-4238-96eb-da5d561be217" containerID="180090cbba2774c274dce6e9904788e91ca874195ab4416c0e0844e7bd8107a5" exitCode=0
Apr 16 04:31:59.417934 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.417904 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzhvj\" (UniqueName: \"kubernetes.io/projected/6724642e-f23d-4124-bd71-50f9d98f8ed6-kube-api-access-qzhvj\") pod \"limitador-operator-controller-manager-85c4996f8c-fl4zz\" (UID: \"6724642e-f23d-4124-bd71-50f9d98f8ed6\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz"
Apr 16 04:31:59.472737 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.472713 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc"
Apr 16 04:31:59.475194 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.475165 2575 status_manager.go:895] "Failed to get status for pod" podUID="bdf7259c-f67a-4238-96eb-da5d561be217" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" err="pods \"limitador-operator-controller-manager-85c4996f8c-cqlkc\" is forbidden: User \"system:node:ip-10-0-133-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-103.ec2.internal' and this object"
Apr 16 04:31:59.518766 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.518729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzhvj\" (UniqueName: \"kubernetes.io/projected/6724642e-f23d-4124-bd71-50f9d98f8ed6-kube-api-access-qzhvj\") pod \"limitador-operator-controller-manager-85c4996f8c-fl4zz\" (UID: \"6724642e-f23d-4124-bd71-50f9d98f8ed6\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz"
Apr 16 04:31:59.534646 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.534608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzhvj\" (UniqueName: \"kubernetes.io/projected/6724642e-f23d-4124-bd71-50f9d98f8ed6-kube-api-access-qzhvj\") pod \"limitador-operator-controller-manager-85c4996f8c-fl4zz\" (UID: \"6724642e-f23d-4124-bd71-50f9d98f8ed6\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz"
Apr 16 04:31:59.610701 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.610666 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz"
Apr 16 04:31:59.619714 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.619600 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpqbc\" (UniqueName: \"kubernetes.io/projected/bdf7259c-f67a-4238-96eb-da5d561be217-kube-api-access-tpqbc\") pod \"bdf7259c-f67a-4238-96eb-da5d561be217\" (UID: \"bdf7259c-f67a-4238-96eb-da5d561be217\") "
Apr 16 04:31:59.621529 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.621504 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdf7259c-f67a-4238-96eb-da5d561be217-kube-api-access-tpqbc" (OuterVolumeSpecName: "kube-api-access-tpqbc") pod "bdf7259c-f67a-4238-96eb-da5d561be217" (UID: "bdf7259c-f67a-4238-96eb-da5d561be217"). InnerVolumeSpecName "kube-api-access-tpqbc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:31:59.720881 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.720847 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tpqbc\" (UniqueName: \"kubernetes.io/projected/bdf7259c-f67a-4238-96eb-da5d561be217-kube-api-access-tpqbc\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:31:59.736567 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.736541 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz"]
Apr 16 04:31:59.738670 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:31:59.738647 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6724642e_f23d_4124_bd71_50f9d98f8ed6.slice/crio-d44f6468cd7723da1f866dd68b0099eeacd0e36314c422bb25601ca6025d846d WatchSource:0}: Error finding container d44f6468cd7723da1f866dd68b0099eeacd0e36314c422bb25601ca6025d846d: Status 404 returned error can't find the container with id d44f6468cd7723da1f866dd68b0099eeacd0e36314c422bb25601ca6025d846d
Apr 16 04:31:59.803685 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.803657 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"]
Apr 16 04:31:59.807075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.807057 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"
Apr 16 04:31:59.809641 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.809620 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ppqd7\""
Apr 16 04:31:59.818500 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.818479 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"]
Apr 16 04:31:59.831784 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.831749 2575 status_manager.go:895] "Failed to get status for pod" podUID="bdf7259c-f67a-4238-96eb-da5d561be217" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" err="pods \"limitador-operator-controller-manager-85c4996f8c-cqlkc\" is forbidden: User \"system:node:ip-10-0-133-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-103.ec2.internal' and this object"
Apr 16 04:31:59.922992 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.922957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea10468f-116f-46e5-b15d-dc8a905e4834-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs\" (UID: \"ea10468f-116f-46e5-b15d-dc8a905e4834\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"
Apr 16 04:31:59.923182 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:31:59.923025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8nps\" (UniqueName: \"kubernetes.io/projected/ea10468f-116f-46e5-b15d-dc8a905e4834-kube-api-access-c8nps\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs\" (UID: \"ea10468f-116f-46e5-b15d-dc8a905e4834\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"
Apr 16 04:32:00.024629 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.024519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8nps\" (UniqueName: \"kubernetes.io/projected/ea10468f-116f-46e5-b15d-dc8a905e4834-kube-api-access-c8nps\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs\" (UID: \"ea10468f-116f-46e5-b15d-dc8a905e4834\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"
Apr 16 04:32:00.024629 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.024618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea10468f-116f-46e5-b15d-dc8a905e4834-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs\" (UID: \"ea10468f-116f-46e5-b15d-dc8a905e4834\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"
Apr 16 04:32:00.025021 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.024999 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea10468f-116f-46e5-b15d-dc8a905e4834-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs\" (UID: \"ea10468f-116f-46e5-b15d-dc8a905e4834\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"
Apr 16 04:32:00.038470 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.038439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8nps\" (UniqueName: \"kubernetes.io/projected/ea10468f-116f-46e5-b15d-dc8a905e4834-kube-api-access-c8nps\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs\" (UID: \"ea10468f-116f-46e5-b15d-dc8a905e4834\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"
Apr 16 04:32:00.118658 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.118621 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"
Apr 16 04:32:00.242445 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.242419 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"]
Apr 16 04:32:00.244600 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:32:00.244570 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea10468f_116f_46e5_b15d_dc8a905e4834.slice/crio-17b215e2edf880eef090c09655d303dd0fc6d5d33cca8cfbb70394c39e9b4140 WatchSource:0}: Error finding container 17b215e2edf880eef090c09655d303dd0fc6d5d33cca8cfbb70394c39e9b4140: Status 404 returned error can't find the container with id 17b215e2edf880eef090c09655d303dd0fc6d5d33cca8cfbb70394c39e9b4140
Apr 16 04:32:00.393144 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.393063 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc"
Apr 16 04:32:00.393356 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.393063 2575 scope.go:117] "RemoveContainer" containerID="180090cbba2774c274dce6e9904788e91ca874195ab4416c0e0844e7bd8107a5"
Apr 16 04:32:00.394443 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.394417 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs" event={"ID":"ea10468f-116f-46e5-b15d-dc8a905e4834","Type":"ContainerStarted","Data":"17b215e2edf880eef090c09655d303dd0fc6d5d33cca8cfbb70394c39e9b4140"}
Apr 16 04:32:00.395861 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.395838 2575 status_manager.go:895] "Failed to get status for pod" podUID="bdf7259c-f67a-4238-96eb-da5d561be217" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" err="pods \"limitador-operator-controller-manager-85c4996f8c-cqlkc\" is forbidden: User \"system:node:ip-10-0-133-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-103.ec2.internal' and this object"
Apr 16 04:32:00.395973 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.395955 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz" event={"ID":"6724642e-f23d-4124-bd71-50f9d98f8ed6","Type":"ContainerStarted","Data":"ae0c4b62197e3a16bcf5ccf67132ebbdf50d3eba30c80a4ead78e18b3b10fb73"}
Apr 16 04:32:00.396025 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.395980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz" event={"ID":"6724642e-f23d-4124-bd71-50f9d98f8ed6","Type":"ContainerStarted","Data":"d44f6468cd7723da1f866dd68b0099eeacd0e36314c422bb25601ca6025d846d"}
Apr 16 04:32:00.396120 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.396105 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz"
Apr 16 04:32:00.398236 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.398207 2575 status_manager.go:895] "Failed to get status for pod" podUID="bdf7259c-f67a-4238-96eb-da5d561be217" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" err="pods \"limitador-operator-controller-manager-85c4996f8c-cqlkc\" is forbidden: User \"system:node:ip-10-0-133-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-103.ec2.internal' and this object"
Apr 16 04:32:00.414815 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.414761 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz" podStartSLOduration=1.414744346 podStartE2EDuration="1.414744346s" podCreationTimestamp="2026-04-16 04:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:32:00.414237152 +0000 UTC m=+468.317013164" watchObservedRunningTime="2026-04-16 04:32:00.414744346 +0000 UTC m=+468.317520345"
Apr 16 04:32:00.416418 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.416390 2575 status_manager.go:895] "Failed to get status for pod" podUID="bdf7259c-f67a-4238-96eb-da5d561be217" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-cqlkc" err="pods \"limitador-operator-controller-manager-85c4996f8c-cqlkc\" is forbidden: User \"system:node:ip-10-0-133-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-103.ec2.internal' and this object"
Apr 16 04:32:00.707695 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:00.707617 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdf7259c-f67a-4238-96eb-da5d561be217" path="/var/lib/kubelet/pods/bdf7259c-f67a-4238-96eb-da5d561be217/volumes"
Apr 16 04:32:04.415823 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:04.415791 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs" event={"ID":"ea10468f-116f-46e5-b15d-dc8a905e4834","Type":"ContainerStarted","Data":"7f5b5ebc347d0d766f039caf62e524717b55fd6326b3f3708a95ed2d48afe07d"}
Apr 16 04:32:04.416232 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:04.415899 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"
Apr 16 04:32:04.437060 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:04.436992 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs" podStartSLOduration=1.49093428 podStartE2EDuration="5.436975718s" podCreationTimestamp="2026-04-16 04:31:59 +0000 UTC" firstStartedPulling="2026-04-16 04:32:00.246890447 +0000 UTC m=+468.149666424" lastFinishedPulling="2026-04-16 04:32:04.19293188 +0000 UTC m=+472.095707862" observedRunningTime="2026-04-16 04:32:04.434314959 +0000 UTC m=+472.337090983" watchObservedRunningTime="2026-04-16 04:32:04.436975718 +0000 UTC m=+472.339751717"
Apr 16 04:32:11.403591 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:11.403563 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fl4zz"
Apr 16 04:32:15.422807 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:15.422778 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs"
Apr 16 04:32:20.441757 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.441705 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-88b6ddcd5-x7jmn" podUID="ef624b49-759e-4d94-85ea-e26cf039a0fc" containerName="console" containerID="cri-o://f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10" gracePeriod=15
Apr 16 04:32:20.680659 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.680634 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-88b6ddcd5-x7jmn_ef624b49-759e-4d94-85ea-e26cf039a0fc/console/0.log"
Apr 16 04:32:20.680786 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.680696 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-88b6ddcd5-x7jmn"
Apr 16 04:32:20.810993 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.810955 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-oauth-serving-cert\") pod \"ef624b49-759e-4d94-85ea-e26cf039a0fc\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") "
Apr 16 04:32:20.811176 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.811013 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-oauth-config\") pod \"ef624b49-759e-4d94-85ea-e26cf039a0fc\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") "
Apr 16 04:32:20.811176 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.811052 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-config\") pod \"ef624b49-759e-4d94-85ea-e26cf039a0fc\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") "
Apr 16 04:32:20.811176 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.811085 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-service-ca\") pod \"ef624b49-759e-4d94-85ea-e26cf039a0fc\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") "
Apr 16 04:32:20.811176 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.811128 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr2zt\" (UniqueName: \"kubernetes.io/projected/ef624b49-759e-4d94-85ea-e26cf039a0fc-kube-api-access-zr2zt\") pod \"ef624b49-759e-4d94-85ea-e26cf039a0fc\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") "
Apr 16 04:32:20.811176 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.811172 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-serving-cert\") pod \"ef624b49-759e-4d94-85ea-e26cf039a0fc\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") "
Apr 16 04:32:20.811447 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.811197 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-trusted-ca-bundle\") pod \"ef624b49-759e-4d94-85ea-e26cf039a0fc\" (UID: \"ef624b49-759e-4d94-85ea-e26cf039a0fc\") "
Apr 16 04:32:20.811506 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.811480 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ef624b49-759e-4d94-85ea-e26cf039a0fc" (UID: "ef624b49-759e-4d94-85ea-e26cf039a0fc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:32:20.811560 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.811519 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-config" (OuterVolumeSpecName: "console-config") pod "ef624b49-759e-4d94-85ea-e26cf039a0fc" (UID: "ef624b49-759e-4d94-85ea-e26cf039a0fc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:32:20.811774 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.811747 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-service-ca" (OuterVolumeSpecName: "service-ca") pod "ef624b49-759e-4d94-85ea-e26cf039a0fc" (UID: "ef624b49-759e-4d94-85ea-e26cf039a0fc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:32:20.811878 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.811744 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ef624b49-759e-4d94-85ea-e26cf039a0fc" (UID: "ef624b49-759e-4d94-85ea-e26cf039a0fc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:32:20.813448 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.813419 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ef624b49-759e-4d94-85ea-e26cf039a0fc" (UID: "ef624b49-759e-4d94-85ea-e26cf039a0fc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:32:20.813536 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.813476 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ef624b49-759e-4d94-85ea-e26cf039a0fc" (UID: "ef624b49-759e-4d94-85ea-e26cf039a0fc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:32:20.813536 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.813483 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef624b49-759e-4d94-85ea-e26cf039a0fc-kube-api-access-zr2zt" (OuterVolumeSpecName: "kube-api-access-zr2zt") pod "ef624b49-759e-4d94-85ea-e26cf039a0fc" (UID: "ef624b49-759e-4d94-85ea-e26cf039a0fc"). InnerVolumeSpecName "kube-api-access-zr2zt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:32:20.912181 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.912150 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-oauth-serving-cert\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:32:20.912181 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.912175 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-oauth-config\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:32:20.912181 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.912185 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-config\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:32:20.912426 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.912194 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-service-ca\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:32:20.912426 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.912203 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zr2zt\" (UniqueName: \"kubernetes.io/projected/ef624b49-759e-4d94-85ea-e26cf039a0fc-kube-api-access-zr2zt\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:32:20.912426 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.912212 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef624b49-759e-4d94-85ea-e26cf039a0fc-console-serving-cert\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:32:20.912426 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:20.912221 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef624b49-759e-4d94-85ea-e26cf039a0fc-trusted-ca-bundle\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:32:21.481449 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:21.481423 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-88b6ddcd5-x7jmn_ef624b49-759e-4d94-85ea-e26cf039a0fc/console/0.log"
Apr 16 04:32:21.481925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:21.481462 2575 generic.go:358] "Generic (PLEG): container finished" podID="ef624b49-759e-4d94-85ea-e26cf039a0fc" containerID="f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10" exitCode=2
Apr 16 04:32:21.481925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:21.481492 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-88b6ddcd5-x7jmn" event={"ID":"ef624b49-759e-4d94-85ea-e26cf039a0fc","Type":"ContainerDied","Data":"f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10"}
Apr 16 04:32:21.481925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:21.481532 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-88b6ddcd5-x7jmn" event={"ID":"ef624b49-759e-4d94-85ea-e26cf039a0fc","Type":"ContainerDied","Data":"61d8d4659941950e20d5ec218a1c1436724c9a912e046651e5dd0f0af806df52"}
Apr 16 04:32:21.481925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:21.481541 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-88b6ddcd5-x7jmn"
Apr 16 04:32:21.481925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:21.481547 2575 scope.go:117] "RemoveContainer" containerID="f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10"
Apr 16 04:32:21.490357 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:21.490319 2575 scope.go:117] "RemoveContainer" containerID="f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10"
Apr 16 04:32:21.490635 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:32:21.490616 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10\": container with ID starting with f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10 not found: ID does not exist" containerID="f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10"
Apr 16 04:32:21.490699 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:21.490641 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10"} err="failed to get container status \"f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10\": rpc error: code = NotFound desc = could not find container \"f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10\": container with ID starting with f3998ea6d54d446189df1a40937de910aed477d18396af751ab3a45362932f10 not found: ID does not exist"
Apr 16 04:32:21.505595 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:21.505569 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-88b6ddcd5-x7jmn"]
Apr 16 04:32:21.511156 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:21.511119 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-88b6ddcd5-x7jmn"]
Apr 16 04:32:22.708274 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:22.708231 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef624b49-759e-4d94-85ea-e26cf039a0fc" path="/var/lib/kubelet/pods/ef624b49-759e-4d94-85ea-e26cf039a0fc/volumes"
Apr 16 04:32:36.999705 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:36.999669 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-vfgc7"]
Apr 16 04:32:37.000126 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:37.000070 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef624b49-759e-4d94-85ea-e26cf039a0fc" containerName="console"
Apr 16 04:32:37.000126 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:37.000083 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef624b49-759e-4d94-85ea-e26cf039a0fc" containerName="console"
Apr 16 04:32:37.000203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:37.000157 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef624b49-759e-4d94-85ea-e26cf039a0fc" containerName="console"
Apr 16 04:32:37.020268 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:37.020229 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-vfgc7"]
Apr 16 04:32:37.020472 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:37.020380 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-vfgc7"
Apr 16 04:32:37.022943 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:37.022920 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-w8nb4\""
Apr 16 04:32:37.169142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:37.169098 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpf72\" (UniqueName: \"kubernetes.io/projected/afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0-kube-api-access-zpf72\") pod \"authorino-f99f4b5cd-vfgc7\" (UID: \"afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0\") " pod="kuadrant-system/authorino-f99f4b5cd-vfgc7"
Apr 16 04:32:37.270203 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:37.270165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpf72\" (UniqueName: \"kubernetes.io/projected/afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0-kube-api-access-zpf72\") pod \"authorino-f99f4b5cd-vfgc7\" (UID: \"afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0\") " pod="kuadrant-system/authorino-f99f4b5cd-vfgc7"
Apr 16 04:32:37.278220 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:37.278185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpf72\" (UniqueName: \"kubernetes.io/projected/afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0-kube-api-access-zpf72\") pod \"authorino-f99f4b5cd-vfgc7\" (UID: \"afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0\") " pod="kuadrant-system/authorino-f99f4b5cd-vfgc7"
Apr 16 04:32:37.331968 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:37.331933 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-vfgc7"
Apr 16 04:32:37.459386 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:37.459360 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-vfgc7"]
Apr 16 04:32:37.461619 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:32:37.461590 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafb0cdc8_2e9d_408d_bf4b_eef5baafcbc0.slice/crio-3989000689d5274b8fe9b12a0891f5bca66c71579951ada77d55e25d837a6089 WatchSource:0}: Error finding container 3989000689d5274b8fe9b12a0891f5bca66c71579951ada77d55e25d837a6089: Status 404 returned error can't find the container with id 3989000689d5274b8fe9b12a0891f5bca66c71579951ada77d55e25d837a6089
Apr 16 04:32:37.542461 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:37.542366 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-vfgc7" event={"ID":"afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0","Type":"ContainerStarted","Data":"3989000689d5274b8fe9b12a0891f5bca66c71579951ada77d55e25d837a6089"}
Apr 16 04:32:40.555192 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:40.555151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-vfgc7" event={"ID":"afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0","Type":"ContainerStarted","Data":"2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d"}
Apr 16 04:32:40.571448 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:40.571396 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-vfgc7" podStartSLOduration=2.100768576 podStartE2EDuration="4.571381559s" podCreationTimestamp="2026-04-16 04:32:36 +0000 UTC" firstStartedPulling="2026-04-16 04:32:37.462960682 +0000 UTC m=+505.365736660" lastFinishedPulling="2026-04-16 04:32:39.933573657 +0000 UTC m=+507.836349643" observedRunningTime="2026-04-16 04:32:40.569611079 +0000 UTC m=+508.472387079" watchObservedRunningTime="2026-04-16 04:32:40.571381559 +0000 UTC m=+508.474157558"
Apr 16 04:32:43.471851 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:43.471813 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-vfgc7"]
Apr 16 04:32:43.472301 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:43.472003 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-vfgc7" podUID="afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0" containerName="authorino" containerID="cri-o://2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d" gracePeriod=30
Apr 16 04:32:43.717566 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:43.717538 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-vfgc7"
Apr 16 04:32:43.832324 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:43.832292 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpf72\" (UniqueName: \"kubernetes.io/projected/afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0-kube-api-access-zpf72\") pod \"afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0\" (UID: \"afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0\") "
Apr 16 04:32:43.834313 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:43.834280 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0-kube-api-access-zpf72" (OuterVolumeSpecName: "kube-api-access-zpf72") pod "afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0" (UID: "afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0"). InnerVolumeSpecName "kube-api-access-zpf72". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:32:43.932906 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:43.932872 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zpf72\" (UniqueName: \"kubernetes.io/projected/afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0-kube-api-access-zpf72\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:32:44.572511 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:44.572470 2575 generic.go:358] "Generic (PLEG): container finished" podID="afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0" containerID="2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d" exitCode=0
Apr 16 04:32:44.572939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:44.572518 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-vfgc7"
Apr 16 04:32:44.572939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:44.572549 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-vfgc7" event={"ID":"afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0","Type":"ContainerDied","Data":"2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d"}
Apr 16 04:32:44.572939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:44.572579 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-vfgc7" event={"ID":"afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0","Type":"ContainerDied","Data":"3989000689d5274b8fe9b12a0891f5bca66c71579951ada77d55e25d837a6089"}
Apr 16 04:32:44.572939 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:44.572595 2575 scope.go:117] "RemoveContainer" containerID="2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d"
Apr 16 04:32:44.582282 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:44.582260 2575 scope.go:117] "RemoveContainer" containerID="2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d"
Apr 16 04:32:44.582565 ip-10-0-133-103 kubenswrapper[2575]: E0416
04:32:44.582548 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d\": container with ID starting with 2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d not found: ID does not exist" containerID="2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d" Apr 16 04:32:44.582624 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:44.582573 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d"} err="failed to get container status \"2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d\": rpc error: code = NotFound desc = could not find container \"2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d\": container with ID starting with 2da2c2689bbebe456c87168fd7d2f5792a5c0f2d6c9d679add07a97aca10ed5d not found: ID does not exist" Apr 16 04:32:44.593232 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:44.593210 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-vfgc7"] Apr 16 04:32:44.596216 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:44.596195 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-vfgc7"] Apr 16 04:32:44.707049 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:32:44.707020 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0" path="/var/lib/kubelet/pods/afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0/volumes" Apr 16 04:33:10.890683 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:10.890601 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm"] Apr 16 04:33:10.891124 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:10.891021 2575 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0" containerName="authorino" Apr 16 04:33:10.891124 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:10.891032 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0" containerName="authorino" Apr 16 04:33:10.891124 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:10.891110 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="afb0cdc8-2e9d-408d-bf4b-eef5baafcbc0" containerName="authorino" Apr 16 04:33:10.894630 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:10.894612 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:10.897487 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:10.897461 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 04:33:10.898619 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:10.898603 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v68fx\"" Apr 16 04:33:10.898707 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:10.898634 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 04:33:10.902598 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:10.902575 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm"] Apr 16 04:33:10.966919 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:10.966884 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tqqq\" (UniqueName: \"kubernetes.io/projected/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-kube-api-access-5tqqq\") 
pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm\" (UID: \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:10.967095 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:10.966927 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm\" (UID: \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:10.967095 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:10.967019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm\" (UID: \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:11.067804 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:11.067765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm\" (UID: \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:11.067964 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:11.067842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tqqq\" (UniqueName: \"kubernetes.io/projected/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-kube-api-access-5tqqq\") pod 
\"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm\" (UID: \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:11.067964 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:11.067875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm\" (UID: \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:11.068251 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:11.068230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm\" (UID: \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:11.068303 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:11.068242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm\" (UID: \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:11.077767 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:11.077744 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tqqq\" (UniqueName: \"kubernetes.io/projected/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-kube-api-access-5tqqq\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm\" (UID: 
\"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:11.204912 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:11.204804 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:11.331522 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:11.331491 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm"] Apr 16 04:33:11.333539 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:33:11.333512 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e499c1a_69c6_4dcd_b112_d0a9186b4efe.slice/crio-dc983a43beb609ae11985f6b9b0d60cd205b35c5d294944a50d6440fd5838345 WatchSource:0}: Error finding container dc983a43beb609ae11985f6b9b0d60cd205b35c5d294944a50d6440fd5838345: Status 404 returned error can't find the container with id dc983a43beb609ae11985f6b9b0d60cd205b35c5d294944a50d6440fd5838345 Apr 16 04:33:11.676544 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:11.676507 2575 generic.go:358] "Generic (PLEG): container finished" podID="2e499c1a-69c6-4dcd-b112-d0a9186b4efe" containerID="43428849fd6f01f0d15ef80032ed539324481c3c901f94b11dd6424e4cbc024c" exitCode=0 Apr 16 04:33:11.676544 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:11.676544 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" event={"ID":"2e499c1a-69c6-4dcd-b112-d0a9186b4efe","Type":"ContainerDied","Data":"43428849fd6f01f0d15ef80032ed539324481c3c901f94b11dd6424e4cbc024c"} Apr 16 04:33:11.676763 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:11.676564 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" event={"ID":"2e499c1a-69c6-4dcd-b112-d0a9186b4efe","Type":"ContainerStarted","Data":"dc983a43beb609ae11985f6b9b0d60cd205b35c5d294944a50d6440fd5838345"} Apr 16 04:33:12.682738 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:12.682641 2575 generic.go:358] "Generic (PLEG): container finished" podID="2e499c1a-69c6-4dcd-b112-d0a9186b4efe" containerID="cf70f38655615ba1ccd400493d81b9a9b64d30eb4b0cd8dde9d9b98a2da40b5f" exitCode=0 Apr 16 04:33:12.683159 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:12.682730 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" event={"ID":"2e499c1a-69c6-4dcd-b112-d0a9186b4efe","Type":"ContainerDied","Data":"cf70f38655615ba1ccd400493d81b9a9b64d30eb4b0cd8dde9d9b98a2da40b5f"} Apr 16 04:33:13.688571 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:13.688486 2575 generic.go:358] "Generic (PLEG): container finished" podID="2e499c1a-69c6-4dcd-b112-d0a9186b4efe" containerID="e834fbd016dbc42b7d2548cdfc94aa7c20e12c440b5e37fba74fb2797831836e" exitCode=0 Apr 16 04:33:13.688900 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:13.688572 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" event={"ID":"2e499c1a-69c6-4dcd-b112-d0a9186b4efe","Type":"ContainerDied","Data":"e834fbd016dbc42b7d2548cdfc94aa7c20e12c440b5e37fba74fb2797831836e"} Apr 16 04:33:14.827721 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:14.827693 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:14.903579 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:14.903537 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-util\") pod \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\" (UID: \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\") " Apr 16 04:33:14.903740 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:14.903626 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tqqq\" (UniqueName: \"kubernetes.io/projected/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-kube-api-access-5tqqq\") pod \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\" (UID: \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\") " Apr 16 04:33:14.903740 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:14.903698 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-bundle\") pod \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\" (UID: \"2e499c1a-69c6-4dcd-b112-d0a9186b4efe\") " Apr 16 04:33:14.904221 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:14.904192 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-bundle" (OuterVolumeSpecName: "bundle") pod "2e499c1a-69c6-4dcd-b112-d0a9186b4efe" (UID: "2e499c1a-69c6-4dcd-b112-d0a9186b4efe"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:33:14.905777 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:14.905743 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-kube-api-access-5tqqq" (OuterVolumeSpecName: "kube-api-access-5tqqq") pod "2e499c1a-69c6-4dcd-b112-d0a9186b4efe" (UID: "2e499c1a-69c6-4dcd-b112-d0a9186b4efe"). InnerVolumeSpecName "kube-api-access-5tqqq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:33:14.909295 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:14.909275 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-util" (OuterVolumeSpecName: "util") pod "2e499c1a-69c6-4dcd-b112-d0a9186b4efe" (UID: "2e499c1a-69c6-4dcd-b112-d0a9186b4efe"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:33:15.005099 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:15.005063 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-bundle\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:33:15.005099 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:15.005097 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-util\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:33:15.005279 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:15.005107 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5tqqq\" (UniqueName: \"kubernetes.io/projected/2e499c1a-69c6-4dcd-b112-d0a9186b4efe-kube-api-access-5tqqq\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:33:15.701206 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:15.701171 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" event={"ID":"2e499c1a-69c6-4dcd-b112-d0a9186b4efe","Type":"ContainerDied","Data":"dc983a43beb609ae11985f6b9b0d60cd205b35c5d294944a50d6440fd5838345"} Apr 16 04:33:15.701206 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:15.701205 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc983a43beb609ae11985f6b9b0d60cd205b35c5d294944a50d6440fd5838345" Apr 16 04:33:15.701436 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:15.701215 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae135044nqm" Apr 16 04:33:33.681250 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.681209 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 16 04:33:33.681631 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.681572 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e499c1a-69c6-4dcd-b112-d0a9186b4efe" containerName="pull" Apr 16 04:33:33.681631 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.681584 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e499c1a-69c6-4dcd-b112-d0a9186b4efe" containerName="pull" Apr 16 04:33:33.681631 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.681597 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e499c1a-69c6-4dcd-b112-d0a9186b4efe" containerName="extract" Apr 16 04:33:33.681631 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.681602 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e499c1a-69c6-4dcd-b112-d0a9186b4efe" containerName="extract" Apr 16 04:33:33.681631 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.681612 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e499c1a-69c6-4dcd-b112-d0a9186b4efe" 
containerName="util" Apr 16 04:33:33.681631 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.681618 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e499c1a-69c6-4dcd-b112-d0a9186b4efe" containerName="util" Apr 16 04:33:33.681823 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.681673 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e499c1a-69c6-4dcd-b112-d0a9186b4efe" containerName="extract" Apr 16 04:33:33.684695 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.684673 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 16 04:33:33.687416 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.687395 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 16 04:33:33.687552 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.687400 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 16 04:33:33.687552 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.687400 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 16 04:33:33.687552 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.687535 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-slx6w\"" Apr 16 04:33:33.692646 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.692624 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 16 04:33:33.770769 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.770729 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zkqk\" (UniqueName: \"kubernetes.io/projected/473425e3-e500-41b5-9700-b4f185b1e3e4-kube-api-access-7zkqk\") pod \"maas-keycloak-0\" (UID: 
\"473425e3-e500-41b5-9700-b4f185b1e3e4\") " pod="keycloak-system/maas-keycloak-0" Apr 16 04:33:33.871651 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.871608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zkqk\" (UniqueName: \"kubernetes.io/projected/473425e3-e500-41b5-9700-b4f185b1e3e4-kube-api-access-7zkqk\") pod \"maas-keycloak-0\" (UID: \"473425e3-e500-41b5-9700-b4f185b1e3e4\") " pod="keycloak-system/maas-keycloak-0" Apr 16 04:33:33.879674 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.879644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zkqk\" (UniqueName: \"kubernetes.io/projected/473425e3-e500-41b5-9700-b4f185b1e3e4-kube-api-access-7zkqk\") pod \"maas-keycloak-0\" (UID: \"473425e3-e500-41b5-9700-b4f185b1e3e4\") " pod="keycloak-system/maas-keycloak-0" Apr 16 04:33:33.994915 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:33.994808 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 16 04:33:34.124535 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:34.124506 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 16 04:33:34.126181 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:33:34.126152 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod473425e3_e500_41b5_9700_b4f185b1e3e4.slice/crio-38a145fcd932f83014a75705e32d6bac6abc5f8dae9427ab7e46924fe1a25a81 WatchSource:0}: Error finding container 38a145fcd932f83014a75705e32d6bac6abc5f8dae9427ab7e46924fe1a25a81: Status 404 returned error can't find the container with id 38a145fcd932f83014a75705e32d6bac6abc5f8dae9427ab7e46924fe1a25a81 Apr 16 04:33:34.774923 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:34.774870 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"473425e3-e500-41b5-9700-b4f185b1e3e4","Type":"ContainerStarted","Data":"38a145fcd932f83014a75705e32d6bac6abc5f8dae9427ab7e46924fe1a25a81"} Apr 16 04:33:39.799034 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:39.798987 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"473425e3-e500-41b5-9700-b4f185b1e3e4","Type":"ContainerStarted","Data":"59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c"} Apr 16 04:33:39.819348 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:39.819233 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.51761789 podStartE2EDuration="6.819208228s" podCreationTimestamp="2026-04-16 04:33:33 +0000 UTC" firstStartedPulling="2026-04-16 04:33:34.128051958 +0000 UTC m=+562.030827953" lastFinishedPulling="2026-04-16 04:33:39.429642304 +0000 UTC m=+567.332418291" observedRunningTime="2026-04-16 04:33:39.814911151 +0000 UTC m=+567.717687151" 
watchObservedRunningTime="2026-04-16 04:33:39.819208228 +0000 UTC m=+567.721984227" Apr 16 04:33:39.995135 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:39.995086 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 16 04:33:39.996977 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:39.996938 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:40.995808 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:40.995750 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:41.995707 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:41.995650 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:42.996108 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:42.996052 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:43.995817 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:43.995758 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" 
Apr 16 04:33:43.996163 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:43.996127 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:44.995685 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:44.995620 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:45.995796 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:45.995747 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:46.996032 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:46.995959 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:47.996180 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:47.996123 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:48.995744 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:48.995697 2575 
prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:49.995981 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:49.995929 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:50.996303 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:50.996255 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:51.996188 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:51.996140 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.44:9000/health/started\": dial tcp 10.134.0.44:9000: connect: connection refused" Apr 16 04:33:53.122913 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:53.122875 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 16 04:33:53.141144 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:33:53.141094 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 04:34:03.129991 ip-10-0-133-103 
kubenswrapper[2575]: I0416 04:34:03.129953 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 16 04:34:04.295045 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.294989 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-f98fbb98f-cb48g"] Apr 16 04:34:04.304866 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.304831 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:04.307652 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.307623 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 16 04:34:04.307820 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.307685 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 16 04:34:04.307820 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.307693 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-h2mqr\"" Apr 16 04:34:04.308163 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.308138 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-f98fbb98f-cb48g"] Apr 16 04:34:04.313509 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.313310 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-565654dcd4-n56b2"] Apr 16 04:34:04.317783 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.317757 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-565654dcd4-n56b2" Apr 16 04:34:04.320615 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.320557 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-x7cw2\"" Apr 16 04:34:04.323236 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.323217 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-565654dcd4-n56b2"] Apr 16 04:34:04.470471 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.470439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-maas-api-tls\") pod \"maas-api-f98fbb98f-cb48g\" (UID: \"cf67c95a-7b9d-4a0c-9157-0a85efa33c57\") " pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:04.470676 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.470494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wntp8\" (UniqueName: \"kubernetes.io/projected/e8cce96d-2d13-419e-8a4f-9f1636dde424-kube-api-access-wntp8\") pod \"maas-controller-565654dcd4-n56b2\" (UID: \"e8cce96d-2d13-419e-8a4f-9f1636dde424\") " pod="opendatahub/maas-controller-565654dcd4-n56b2" Apr 16 04:34:04.470676 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.470523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88tbx\" (UniqueName: \"kubernetes.io/projected/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-kube-api-access-88tbx\") pod \"maas-api-f98fbb98f-cb48g\" (UID: \"cf67c95a-7b9d-4a0c-9157-0a85efa33c57\") " pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:04.571779 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.571687 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wntp8\" (UniqueName: 
\"kubernetes.io/projected/e8cce96d-2d13-419e-8a4f-9f1636dde424-kube-api-access-wntp8\") pod \"maas-controller-565654dcd4-n56b2\" (UID: \"e8cce96d-2d13-419e-8a4f-9f1636dde424\") " pod="opendatahub/maas-controller-565654dcd4-n56b2" Apr 16 04:34:04.571779 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.571724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88tbx\" (UniqueName: \"kubernetes.io/projected/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-kube-api-access-88tbx\") pod \"maas-api-f98fbb98f-cb48g\" (UID: \"cf67c95a-7b9d-4a0c-9157-0a85efa33c57\") " pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:04.571994 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.571810 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-maas-api-tls\") pod \"maas-api-f98fbb98f-cb48g\" (UID: \"cf67c95a-7b9d-4a0c-9157-0a85efa33c57\") " pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:04.571994 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:34:04.571915 2575 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 16 04:34:04.571994 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:34:04.571977 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-maas-api-tls podName:cf67c95a-7b9d-4a0c-9157-0a85efa33c57 nodeName:}" failed. No retries permitted until 2026-04-16 04:34:05.071958057 +0000 UTC m=+592.974734036 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-maas-api-tls") pod "maas-api-f98fbb98f-cb48g" (UID: "cf67c95a-7b9d-4a0c-9157-0a85efa33c57") : secret "maas-api-serving-cert" not found Apr 16 04:34:04.582150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.582120 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88tbx\" (UniqueName: \"kubernetes.io/projected/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-kube-api-access-88tbx\") pod \"maas-api-f98fbb98f-cb48g\" (UID: \"cf67c95a-7b9d-4a0c-9157-0a85efa33c57\") " pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:04.582434 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.582417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wntp8\" (UniqueName: \"kubernetes.io/projected/e8cce96d-2d13-419e-8a4f-9f1636dde424-kube-api-access-wntp8\") pod \"maas-controller-565654dcd4-n56b2\" (UID: \"e8cce96d-2d13-419e-8a4f-9f1636dde424\") " pod="opendatahub/maas-controller-565654dcd4-n56b2" Apr 16 04:34:04.630321 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.630278 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-565654dcd4-n56b2" Apr 16 04:34:04.977633 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:04.977602 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-565654dcd4-n56b2"] Apr 16 04:34:04.979085 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:34:04.979050 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8cce96d_2d13_419e_8a4f_9f1636dde424.slice/crio-338e7bbeb9bfb9fe843b1aa8e428aac066d69110b4d92b5dbac4c00f73a3f2db WatchSource:0}: Error finding container 338e7bbeb9bfb9fe843b1aa8e428aac066d69110b4d92b5dbac4c00f73a3f2db: Status 404 returned error can't find the container with id 338e7bbeb9bfb9fe843b1aa8e428aac066d69110b4d92b5dbac4c00f73a3f2db Apr 16 04:34:05.077600 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.077554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-maas-api-tls\") pod \"maas-api-f98fbb98f-cb48g\" (UID: \"cf67c95a-7b9d-4a0c-9157-0a85efa33c57\") " pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:05.079988 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.079964 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-maas-api-tls\") pod \"maas-api-f98fbb98f-cb48g\" (UID: \"cf67c95a-7b9d-4a0c-9157-0a85efa33c57\") " pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:05.118270 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.118234 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-54997d6c57-4xkbg"] Apr 16 04:34:05.123446 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.123423 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-54997d6c57-4xkbg" Apr 16 04:34:05.145150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.145122 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-54997d6c57-4xkbg"] Apr 16 04:34:05.217560 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.217526 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:05.280098 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.280063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0f5cd07a-518a-4d00-b76d-86795485da57-maas-api-tls\") pod \"maas-api-54997d6c57-4xkbg\" (UID: \"0f5cd07a-518a-4d00-b76d-86795485da57\") " pod="opendatahub/maas-api-54997d6c57-4xkbg" Apr 16 04:34:05.280263 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.280163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58t5w\" (UniqueName: \"kubernetes.io/projected/0f5cd07a-518a-4d00-b76d-86795485da57-kube-api-access-58t5w\") pod \"maas-api-54997d6c57-4xkbg\" (UID: \"0f5cd07a-518a-4d00-b76d-86795485da57\") " pod="opendatahub/maas-api-54997d6c57-4xkbg" Apr 16 04:34:05.351196 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.351165 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-f98fbb98f-cb48g"] Apr 16 04:34:05.352755 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:34:05.352731 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf67c95a_7b9d_4a0c_9157_0a85efa33c57.slice/crio-603600f349b7f17f2341282dbfb20db315a9f8412cf13d4f1598763fb2936759 WatchSource:0}: Error finding container 603600f349b7f17f2341282dbfb20db315a9f8412cf13d4f1598763fb2936759: Status 404 returned error can't find the container with id 
603600f349b7f17f2341282dbfb20db315a9f8412cf13d4f1598763fb2936759 Apr 16 04:34:05.380954 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.380924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0f5cd07a-518a-4d00-b76d-86795485da57-maas-api-tls\") pod \"maas-api-54997d6c57-4xkbg\" (UID: \"0f5cd07a-518a-4d00-b76d-86795485da57\") " pod="opendatahub/maas-api-54997d6c57-4xkbg" Apr 16 04:34:05.381149 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.381016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58t5w\" (UniqueName: \"kubernetes.io/projected/0f5cd07a-518a-4d00-b76d-86795485da57-kube-api-access-58t5w\") pod \"maas-api-54997d6c57-4xkbg\" (UID: \"0f5cd07a-518a-4d00-b76d-86795485da57\") " pod="opendatahub/maas-api-54997d6c57-4xkbg" Apr 16 04:34:05.383364 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.383320 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0f5cd07a-518a-4d00-b76d-86795485da57-maas-api-tls\") pod \"maas-api-54997d6c57-4xkbg\" (UID: \"0f5cd07a-518a-4d00-b76d-86795485da57\") " pod="opendatahub/maas-api-54997d6c57-4xkbg" Apr 16 04:34:05.388721 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.388694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58t5w\" (UniqueName: \"kubernetes.io/projected/0f5cd07a-518a-4d00-b76d-86795485da57-kube-api-access-58t5w\") pod \"maas-api-54997d6c57-4xkbg\" (UID: \"0f5cd07a-518a-4d00-b76d-86795485da57\") " pod="opendatahub/maas-api-54997d6c57-4xkbg" Apr 16 04:34:05.434191 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.434151 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-54997d6c57-4xkbg" Apr 16 04:34:05.580510 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.580462 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-54997d6c57-4xkbg"] Apr 16 04:34:05.585113 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:34:05.585021 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f5cd07a_518a_4d00_b76d_86795485da57.slice/crio-427890bc0119971045180739248b74e31717684baade3edb0b54a7b962f23f92 WatchSource:0}: Error finding container 427890bc0119971045180739248b74e31717684baade3edb0b54a7b962f23f92: Status 404 returned error can't find the container with id 427890bc0119971045180739248b74e31717684baade3edb0b54a7b962f23f92 Apr 16 04:34:05.616023 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.615964 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-766d555cff-4lsmj"] Apr 16 04:34:05.620720 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.620690 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-766d555cff-4lsmj" Apr 16 04:34:05.623674 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.623648 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 04:34:05.623915 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.623892 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-w8nb4\"" Apr 16 04:34:05.625147 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.625122 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-766d555cff-4lsmj"] Apr 16 04:34:05.784852 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.784801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db-tls-cert\") pod \"authorino-766d555cff-4lsmj\" (UID: \"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db\") " pod="kuadrant-system/authorino-766d555cff-4lsmj" Apr 16 04:34:05.785052 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.784890 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x5n8\" (UniqueName: \"kubernetes.io/projected/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db-kube-api-access-7x5n8\") pod \"authorino-766d555cff-4lsmj\" (UID: \"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db\") " pod="kuadrant-system/authorino-766d555cff-4lsmj" Apr 16 04:34:05.886268 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.886232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x5n8\" (UniqueName: \"kubernetes.io/projected/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db-kube-api-access-7x5n8\") pod \"authorino-766d555cff-4lsmj\" (UID: \"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db\") " pod="kuadrant-system/authorino-766d555cff-4lsmj" Apr 16 04:34:05.886490 
ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.886341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db-tls-cert\") pod \"authorino-766d555cff-4lsmj\" (UID: \"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db\") " pod="kuadrant-system/authorino-766d555cff-4lsmj" Apr 16 04:34:05.889035 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.889011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db-tls-cert\") pod \"authorino-766d555cff-4lsmj\" (UID: \"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db\") " pod="kuadrant-system/authorino-766d555cff-4lsmj" Apr 16 04:34:05.894905 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.894875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x5n8\" (UniqueName: \"kubernetes.io/projected/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db-kube-api-access-7x5n8\") pod \"authorino-766d555cff-4lsmj\" (UID: \"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db\") " pod="kuadrant-system/authorino-766d555cff-4lsmj" Apr 16 04:34:05.923637 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.923582 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-54997d6c57-4xkbg" event={"ID":"0f5cd07a-518a-4d00-b76d-86795485da57","Type":"ContainerStarted","Data":"427890bc0119971045180739248b74e31717684baade3edb0b54a7b962f23f92"} Apr 16 04:34:05.925013 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.924978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-565654dcd4-n56b2" event={"ID":"e8cce96d-2d13-419e-8a4f-9f1636dde424","Type":"ContainerStarted","Data":"338e7bbeb9bfb9fe843b1aa8e428aac066d69110b4d92b5dbac4c00f73a3f2db"} Apr 16 04:34:05.926350 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.926283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/maas-api-f98fbb98f-cb48g" event={"ID":"cf67c95a-7b9d-4a0c-9157-0a85efa33c57","Type":"ContainerStarted","Data":"603600f349b7f17f2341282dbfb20db315a9f8412cf13d4f1598763fb2936759"} Apr 16 04:34:05.936609 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:05.936579 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-766d555cff-4lsmj" Apr 16 04:34:06.098912 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:06.098743 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-766d555cff-4lsmj"] Apr 16 04:34:06.101111 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:34:06.101073 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecc2b8b6_1dda_46e6_ac6b_a9ef7b9981db.slice/crio-fe8503aad63d2bda9c3c783563ef82d5ffaff5889d72feccf57cb5f51aa25342 WatchSource:0}: Error finding container fe8503aad63d2bda9c3c783563ef82d5ffaff5889d72feccf57cb5f51aa25342: Status 404 returned error can't find the container with id fe8503aad63d2bda9c3c783563ef82d5ffaff5889d72feccf57cb5f51aa25342 Apr 16 04:34:06.934072 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:06.934034 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-766d555cff-4lsmj" event={"ID":"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db","Type":"ContainerStarted","Data":"fe8503aad63d2bda9c3c783563ef82d5ffaff5889d72feccf57cb5f51aa25342"} Apr 16 04:34:08.945176 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:08.945138 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-f98fbb98f-cb48g" event={"ID":"cf67c95a-7b9d-4a0c-9157-0a85efa33c57","Type":"ContainerStarted","Data":"e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562"} Apr 16 04:34:08.945176 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:08.945189 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:08.946535 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:08.946513 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-54997d6c57-4xkbg" event={"ID":"0f5cd07a-518a-4d00-b76d-86795485da57","Type":"ContainerStarted","Data":"aa093d553621d4cb79682c55de48ce10a5c079ca3b8339402acbe6d7b43d2e60"} Apr 16 04:34:08.946642 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:08.946623 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-54997d6c57-4xkbg" Apr 16 04:34:08.947854 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:08.947837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-565654dcd4-n56b2" event={"ID":"e8cce96d-2d13-419e-8a4f-9f1636dde424","Type":"ContainerStarted","Data":"f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4"} Apr 16 04:34:08.947915 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:08.947898 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-565654dcd4-n56b2" Apr 16 04:34:08.949144 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:08.949118 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-766d555cff-4lsmj" event={"ID":"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db","Type":"ContainerStarted","Data":"1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044"} Apr 16 04:34:08.961900 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:08.961853 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-f98fbb98f-cb48g" podStartSLOduration=2.159122554 podStartE2EDuration="4.961837843s" podCreationTimestamp="2026-04-16 04:34:04 +0000 UTC" firstStartedPulling="2026-04-16 04:34:05.35412822 +0000 UTC m=+593.256904198" lastFinishedPulling="2026-04-16 04:34:08.1568435 +0000 UTC m=+596.059619487" observedRunningTime="2026-04-16 04:34:08.959918569 +0000 
UTC m=+596.862694571" watchObservedRunningTime="2026-04-16 04:34:08.961837843 +0000 UTC m=+596.864613852" Apr 16 04:34:08.973581 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:08.973524 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-565654dcd4-n56b2" podStartSLOduration=1.79747943 podStartE2EDuration="4.9735066s" podCreationTimestamp="2026-04-16 04:34:04 +0000 UTC" firstStartedPulling="2026-04-16 04:34:04.980840698 +0000 UTC m=+592.883616683" lastFinishedPulling="2026-04-16 04:34:08.156867862 +0000 UTC m=+596.059643853" observedRunningTime="2026-04-16 04:34:08.973006315 +0000 UTC m=+596.875782305" watchObservedRunningTime="2026-04-16 04:34:08.9735066 +0000 UTC m=+596.876282613" Apr 16 04:34:08.989590 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:08.989532 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-766d555cff-4lsmj" podStartSLOduration=1.935345088 podStartE2EDuration="3.989512432s" podCreationTimestamp="2026-04-16 04:34:05 +0000 UTC" firstStartedPulling="2026-04-16 04:34:06.102874058 +0000 UTC m=+594.005650039" lastFinishedPulling="2026-04-16 04:34:08.157041405 +0000 UTC m=+596.059817383" observedRunningTime="2026-04-16 04:34:08.989315386 +0000 UTC m=+596.892091389" watchObservedRunningTime="2026-04-16 04:34:08.989512432 +0000 UTC m=+596.892288436" Apr 16 04:34:09.009476 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:09.009422 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-54997d6c57-4xkbg" podStartSLOduration=1.438350007 podStartE2EDuration="4.009405506s" podCreationTimestamp="2026-04-16 04:34:05 +0000 UTC" firstStartedPulling="2026-04-16 04:34:05.586498627 +0000 UTC m=+593.489274617" lastFinishedPulling="2026-04-16 04:34:08.157554128 +0000 UTC m=+596.060330116" observedRunningTime="2026-04-16 04:34:09.007049786 +0000 UTC m=+596.909825783" watchObservedRunningTime="2026-04-16 
04:34:09.009405506 +0000 UTC m=+596.912181504" Apr 16 04:34:13.326963 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:13.326932 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:34:13.329537 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:13.327127 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:34:14.960458 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:14.960416 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-54997d6c57-4xkbg" Apr 16 04:34:14.960911 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:14.960618 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:15.016358 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.016309 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-f98fbb98f-cb48g"] Apr 16 04:34:15.016592 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.016568 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-f98fbb98f-cb48g" podUID="cf67c95a-7b9d-4a0c-9157-0a85efa33c57" containerName="maas-api" containerID="cri-o://e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562" gracePeriod=30 Apr 16 04:34:15.266340 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.266305 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:15.269574 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.269553 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88tbx\" (UniqueName: \"kubernetes.io/projected/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-kube-api-access-88tbx\") pod \"cf67c95a-7b9d-4a0c-9157-0a85efa33c57\" (UID: \"cf67c95a-7b9d-4a0c-9157-0a85efa33c57\") " Apr 16 04:34:15.269640 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.269602 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-maas-api-tls\") pod \"cf67c95a-7b9d-4a0c-9157-0a85efa33c57\" (UID: \"cf67c95a-7b9d-4a0c-9157-0a85efa33c57\") " Apr 16 04:34:15.271750 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.271719 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-kube-api-access-88tbx" (OuterVolumeSpecName: "kube-api-access-88tbx") pod "cf67c95a-7b9d-4a0c-9157-0a85efa33c57" (UID: "cf67c95a-7b9d-4a0c-9157-0a85efa33c57"). InnerVolumeSpecName "kube-api-access-88tbx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:34:15.271877 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.271787 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "cf67c95a-7b9d-4a0c-9157-0a85efa33c57" (UID: "cf67c95a-7b9d-4a0c-9157-0a85efa33c57"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:34:15.370392 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.370318 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-88tbx\" (UniqueName: \"kubernetes.io/projected/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-kube-api-access-88tbx\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:34:15.370392 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.370385 2575 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/cf67c95a-7b9d-4a0c-9157-0a85efa33c57-maas-api-tls\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\"" Apr 16 04:34:15.982514 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.982478 2575 generic.go:358] "Generic (PLEG): container finished" podID="cf67c95a-7b9d-4a0c-9157-0a85efa33c57" containerID="e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562" exitCode=0 Apr 16 04:34:15.982925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.982541 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-f98fbb98f-cb48g" Apr 16 04:34:15.982925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.982562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-f98fbb98f-cb48g" event={"ID":"cf67c95a-7b9d-4a0c-9157-0a85efa33c57","Type":"ContainerDied","Data":"e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562"} Apr 16 04:34:15.982925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.982601 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-f98fbb98f-cb48g" event={"ID":"cf67c95a-7b9d-4a0c-9157-0a85efa33c57","Type":"ContainerDied","Data":"603600f349b7f17f2341282dbfb20db315a9f8412cf13d4f1598763fb2936759"} Apr 16 04:34:15.982925 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.982621 2575 scope.go:117] "RemoveContainer" containerID="e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562" Apr 16 04:34:15.991871 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.991853 2575 scope.go:117] "RemoveContainer" containerID="e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562" Apr 16 04:34:15.992131 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:34:15.992110 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562\": container with ID starting with e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562 not found: ID does not exist" containerID="e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562" Apr 16 04:34:15.992201 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:15.992146 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562"} err="failed to get container status \"e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562\": rpc error: code = NotFound desc = 
could not find container \"e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562\": container with ID starting with e00f4cf5acabbbef568d3a856dc8cd9b292e8be04f1addf377a9b7b74a4ff562 not found: ID does not exist" Apr 16 04:34:16.003921 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:16.003895 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-f98fbb98f-cb48g"] Apr 16 04:34:16.009283 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:16.009259 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-f98fbb98f-cb48g"] Apr 16 04:34:16.708031 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:16.707989 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf67c95a-7b9d-4a0c-9157-0a85efa33c57" path="/var/lib/kubelet/pods/cf67c95a-7b9d-4a0c-9157-0a85efa33c57/volumes" Apr 16 04:34:19.959109 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:19.959081 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-565654dcd4-n56b2" Apr 16 04:34:20.259716 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:20.259680 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5757c45c47-9hxdq"] Apr 16 04:34:20.260092 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:20.260079 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf67c95a-7b9d-4a0c-9157-0a85efa33c57" containerName="maas-api" Apr 16 04:34:20.260139 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:20.260093 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf67c95a-7b9d-4a0c-9157-0a85efa33c57" containerName="maas-api" Apr 16 04:34:20.260174 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:20.260146 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf67c95a-7b9d-4a0c-9157-0a85efa33c57" containerName="maas-api" Apr 16 04:34:20.264419 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:20.264399 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5757c45c47-9hxdq"
Apr 16 04:34:20.269695 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:20.269669 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5757c45c47-9hxdq"]
Apr 16 04:34:20.305170 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:20.305123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqf89\" (UniqueName: \"kubernetes.io/projected/252cb17f-e425-41c7-92e5-6f96222c7ab1-kube-api-access-dqf89\") pod \"maas-controller-5757c45c47-9hxdq\" (UID: \"252cb17f-e425-41c7-92e5-6f96222c7ab1\") " pod="opendatahub/maas-controller-5757c45c47-9hxdq"
Apr 16 04:34:20.406177 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:20.406138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqf89\" (UniqueName: \"kubernetes.io/projected/252cb17f-e425-41c7-92e5-6f96222c7ab1-kube-api-access-dqf89\") pod \"maas-controller-5757c45c47-9hxdq\" (UID: \"252cb17f-e425-41c7-92e5-6f96222c7ab1\") " pod="opendatahub/maas-controller-5757c45c47-9hxdq"
Apr 16 04:34:20.414695 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:20.414669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqf89\" (UniqueName: \"kubernetes.io/projected/252cb17f-e425-41c7-92e5-6f96222c7ab1-kube-api-access-dqf89\") pod \"maas-controller-5757c45c47-9hxdq\" (UID: \"252cb17f-e425-41c7-92e5-6f96222c7ab1\") " pod="opendatahub/maas-controller-5757c45c47-9hxdq"
Apr 16 04:34:20.575781 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:20.575687 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5757c45c47-9hxdq"
Apr 16 04:34:20.700243 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:20.700211 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5757c45c47-9hxdq"]
Apr 16 04:34:20.701956 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:34:20.701930 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod252cb17f_e425_41c7_92e5_6f96222c7ab1.slice/crio-10eb55c01b2464a8848e0b1081cb285ffd789f6abaee8e9b0c81c990d37cd368 WatchSource:0}: Error finding container 10eb55c01b2464a8848e0b1081cb285ffd789f6abaee8e9b0c81c990d37cd368: Status 404 returned error can't find the container with id 10eb55c01b2464a8848e0b1081cb285ffd789f6abaee8e9b0c81c990d37cd368
Apr 16 04:34:21.003938 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:21.003900 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5757c45c47-9hxdq" event={"ID":"252cb17f-e425-41c7-92e5-6f96222c7ab1","Type":"ContainerStarted","Data":"10eb55c01b2464a8848e0b1081cb285ffd789f6abaee8e9b0c81c990d37cd368"}
Apr 16 04:34:22.009350 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:22.009285 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5757c45c47-9hxdq" event={"ID":"252cb17f-e425-41c7-92e5-6f96222c7ab1","Type":"ContainerStarted","Data":"59d37f4e92dbd2d86ede8ba9149939d848530dfe33b5cdd08fd8369bc7ed14dc"}
Apr 16 04:34:22.009791 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:22.009372 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5757c45c47-9hxdq"
Apr 16 04:34:22.028977 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:22.028921 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5757c45c47-9hxdq" podStartSLOduration=1.5719954189999998 podStartE2EDuration="2.028905249s" podCreationTimestamp="2026-04-16 04:34:20 +0000 UTC" firstStartedPulling="2026-04-16 04:34:20.703706261 +0000 UTC m=+608.606482245" lastFinishedPulling="2026-04-16 04:34:21.160616095 +0000 UTC m=+609.063392075" observedRunningTime="2026-04-16 04:34:22.027973301 +0000 UTC m=+609.930749299" watchObservedRunningTime="2026-04-16 04:34:22.028905249 +0000 UTC m=+609.931681247"
Apr 16 04:34:33.019267 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:33.019193 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5757c45c47-9hxdq"
Apr 16 04:34:33.057010 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:33.056980 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-565654dcd4-n56b2"]
Apr 16 04:34:33.057248 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:33.057222 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-565654dcd4-n56b2" podUID="e8cce96d-2d13-419e-8a4f-9f1636dde424" containerName="manager" containerID="cri-o://f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4" gracePeriod=10
Apr 16 04:34:33.296725 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:33.296701 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-565654dcd4-n56b2"
Apr 16 04:34:33.419982 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:33.419941 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wntp8\" (UniqueName: \"kubernetes.io/projected/e8cce96d-2d13-419e-8a4f-9f1636dde424-kube-api-access-wntp8\") pod \"e8cce96d-2d13-419e-8a4f-9f1636dde424\" (UID: \"e8cce96d-2d13-419e-8a4f-9f1636dde424\") "
Apr 16 04:34:33.422028 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:33.422004 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8cce96d-2d13-419e-8a4f-9f1636dde424-kube-api-access-wntp8" (OuterVolumeSpecName: "kube-api-access-wntp8") pod "e8cce96d-2d13-419e-8a4f-9f1636dde424" (UID: "e8cce96d-2d13-419e-8a4f-9f1636dde424"). InnerVolumeSpecName "kube-api-access-wntp8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:34:33.521624 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:33.521589 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wntp8\" (UniqueName: \"kubernetes.io/projected/e8cce96d-2d13-419e-8a4f-9f1636dde424-kube-api-access-wntp8\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:34:33.866273 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:33.866237 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 16 04:34:33.866535 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:33.866503 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="keycloak-system/maas-keycloak-0" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak" containerID="cri-o://59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c" gracePeriod=30
Apr 16 04:34:34.057106 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:34.057068 2575 generic.go:358] "Generic (PLEG): container finished" podID="e8cce96d-2d13-419e-8a4f-9f1636dde424" containerID="f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4" exitCode=0
Apr 16 04:34:34.057537 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:34.057119 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-565654dcd4-n56b2" event={"ID":"e8cce96d-2d13-419e-8a4f-9f1636dde424","Type":"ContainerDied","Data":"f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4"}
Apr 16 04:34:34.057537 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:34.057126 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-565654dcd4-n56b2"
Apr 16 04:34:34.057537 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:34.057148 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-565654dcd4-n56b2" event={"ID":"e8cce96d-2d13-419e-8a4f-9f1636dde424","Type":"ContainerDied","Data":"338e7bbeb9bfb9fe843b1aa8e428aac066d69110b4d92b5dbac4c00f73a3f2db"}
Apr 16 04:34:34.057537 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:34.057165 2575 scope.go:117] "RemoveContainer" containerID="f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4"
Apr 16 04:34:34.066315 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:34.066299 2575 scope.go:117] "RemoveContainer" containerID="f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4"
Apr 16 04:34:34.066598 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:34:34.066581 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4\": container with ID starting with f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4 not found: ID does not exist" containerID="f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4"
Apr 16 04:34:34.066650 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:34.066606 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4"} err="failed to get container status \"f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4\": rpc error: code = NotFound desc = could not find container \"f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4\": container with ID starting with f28430fd2e5accd5826c2e5e8f4a59c9a7d71656d355723c66208d6b8183e0e4 not found: ID does not exist"
Apr 16 04:34:34.078022 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:34.077989 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-565654dcd4-n56b2"]
Apr 16 04:34:34.081669 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:34.081642 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-565654dcd4-n56b2"]
Apr 16 04:34:34.708148 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:34.708111 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8cce96d-2d13-419e-8a4f-9f1636dde424" path="/var/lib/kubelet/pods/e8cce96d-2d13-419e-8a4f-9f1636dde424/volumes"
Apr 16 04:34:35.915432 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:35.915402 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:36.043994 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.043893 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zkqk\" (UniqueName: \"kubernetes.io/projected/473425e3-e500-41b5-9700-b4f185b1e3e4-kube-api-access-7zkqk\") pod \"473425e3-e500-41b5-9700-b4f185b1e3e4\" (UID: \"473425e3-e500-41b5-9700-b4f185b1e3e4\") "
Apr 16 04:34:36.046169 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.046135 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473425e3-e500-41b5-9700-b4f185b1e3e4-kube-api-access-7zkqk" (OuterVolumeSpecName: "kube-api-access-7zkqk") pod "473425e3-e500-41b5-9700-b4f185b1e3e4" (UID: "473425e3-e500-41b5-9700-b4f185b1e3e4"). InnerVolumeSpecName "kube-api-access-7zkqk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:34:36.066557 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.066520 2575 generic.go:358] "Generic (PLEG): container finished" podID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerID="59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c" exitCode=143
Apr 16 04:34:36.066699 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.066566 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"473425e3-e500-41b5-9700-b4f185b1e3e4","Type":"ContainerDied","Data":"59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c"}
Apr 16 04:34:36.066699 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.066584 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:36.066699 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.066594 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"473425e3-e500-41b5-9700-b4f185b1e3e4","Type":"ContainerDied","Data":"38a145fcd932f83014a75705e32d6bac6abc5f8dae9427ab7e46924fe1a25a81"}
Apr 16 04:34:36.066699 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.066613 2575 scope.go:117] "RemoveContainer" containerID="59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c"
Apr 16 04:34:36.077064 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.077042 2575 scope.go:117] "RemoveContainer" containerID="59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c"
Apr 16 04:34:36.077354 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:34:36.077319 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c\": container with ID starting with 59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c not found: ID does not exist" containerID="59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c"
Apr 16 04:34:36.077406 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.077366 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c"} err="failed to get container status \"59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c\": rpc error: code = NotFound desc = could not find container \"59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c\": container with ID starting with 59889d9654945973545cebaec5dfb4e912a402aba3b934936b8158964953a89c not found: ID does not exist"
Apr 16 04:34:36.087951 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.087920 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 16 04:34:36.090149 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.090123 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 16 04:34:36.107355 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.107311 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 16 04:34:36.107731 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.107719 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8cce96d-2d13-419e-8a4f-9f1636dde424" containerName="manager"
Apr 16 04:34:36.107794 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.107732 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8cce96d-2d13-419e-8a4f-9f1636dde424" containerName="manager"
Apr 16 04:34:36.107794 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.107760 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak"
Apr 16 04:34:36.107794 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.107765 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak"
Apr 16 04:34:36.107892 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.107827 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" containerName="keycloak"
Apr 16 04:34:36.107892 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.107836 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8cce96d-2d13-419e-8a4f-9f1636dde424" containerName="manager"
Apr 16 04:34:36.112461 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.112442 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:36.114944 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.114892 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\""
Apr 16 04:34:36.115130 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.115078 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"keycloak-test-realms\""
Apr 16 04:34:36.115204 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.115187 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-slx6w\""
Apr 16 04:34:36.115286 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.115249 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 16 04:34:36.115286 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.115267 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 16 04:34:36.119240 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.119214 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 16 04:34:36.145229 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.145200 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zkqk\" (UniqueName: \"kubernetes.io/projected/473425e3-e500-41b5-9700-b4f185b1e3e4-kube-api-access-7zkqk\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:34:36.246411 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.246362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm2rt\" (UniqueName: \"kubernetes.io/projected/65eed9e8-ea29-43f4-9367-16092aa35d02-kube-api-access-jm2rt\") pod \"maas-keycloak-0\" (UID: \"65eed9e8-ea29-43f4-9367-16092aa35d02\") " pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:36.246559 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.246453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/65eed9e8-ea29-43f4-9367-16092aa35d02-test-realms\") pod \"maas-keycloak-0\" (UID: \"65eed9e8-ea29-43f4-9367-16092aa35d02\") " pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:36.347779 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.347686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/65eed9e8-ea29-43f4-9367-16092aa35d02-test-realms\") pod \"maas-keycloak-0\" (UID: \"65eed9e8-ea29-43f4-9367-16092aa35d02\") " pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:36.347779 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.347759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jm2rt\" (UniqueName: \"kubernetes.io/projected/65eed9e8-ea29-43f4-9367-16092aa35d02-kube-api-access-jm2rt\") pod \"maas-keycloak-0\" (UID: \"65eed9e8-ea29-43f4-9367-16092aa35d02\") " pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:36.348431 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.348413 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/65eed9e8-ea29-43f4-9367-16092aa35d02-test-realms\") pod \"maas-keycloak-0\" (UID: \"65eed9e8-ea29-43f4-9367-16092aa35d02\") " pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:36.355193 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.355168 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm2rt\" (UniqueName: \"kubernetes.io/projected/65eed9e8-ea29-43f4-9367-16092aa35d02-kube-api-access-jm2rt\") pod \"maas-keycloak-0\" (UID: \"65eed9e8-ea29-43f4-9367-16092aa35d02\") " pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:36.423206 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.423168 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:36.551051 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.551016 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 16 04:34:36.708125 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:36.708039 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="473425e3-e500-41b5-9700-b4f185b1e3e4" path="/var/lib/kubelet/pods/473425e3-e500-41b5-9700-b4f185b1e3e4/volumes"
Apr 16 04:34:37.074404 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:37.074370 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"65eed9e8-ea29-43f4-9367-16092aa35d02","Type":"ContainerStarted","Data":"7992cf7c3d191df331f1d4b4fe724138223a53bda5596a255bd3dfb97a9cbc60"}
Apr 16 04:34:37.074404 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:37.074408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"65eed9e8-ea29-43f4-9367-16092aa35d02","Type":"ContainerStarted","Data":"6f5df25ed0bcfe9867c1f7f50e42240d49b48b36e627e13c00fd7a90910a20a7"}
Apr 16 04:34:37.091918 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:37.091856 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=0.689195446 podStartE2EDuration="1.091838514s" podCreationTimestamp="2026-04-16 04:34:36 +0000 UTC" firstStartedPulling="2026-04-16 04:34:36.557183746 +0000 UTC m=+624.459959724" lastFinishedPulling="2026-04-16 04:34:36.95982681 +0000 UTC m=+624.862602792" observedRunningTime="2026-04-16 04:34:37.090261984 +0000 UTC m=+624.993037997" watchObservedRunningTime="2026-04-16 04:34:37.091838514 +0000 UTC m=+624.994614516"
Apr 16 04:34:37.423931 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:37.423884 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:37.425732 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:37.425699 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:38.424184 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:38.424125 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:39.424143 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:39.424089 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:40.424556 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:40.424506 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:41.424591 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:41.424544 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:42.423953 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:42.423907 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:43.424523 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:43.424475 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:44.424400 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:44.424348 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:45.424398 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:45.424351 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:46.424063 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:46.424011 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:46.424314 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:46.424283 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:47.424403 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:47.424319 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:48.424060 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:48.424007 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:49.424633 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:49.424588 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:50.424048 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:50.423985 2575 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.50:9000/health/started\": dial tcp 10.134.0.50:9000: connect: connection refused"
Apr 16 04:34:51.559464 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:51.558106 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0"
Apr 16 04:34:51.580132 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:34:51.579829 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="65eed9e8-ea29-43f4-9367-16092aa35d02" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 04:35:01.566037 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:01.566003 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0"
Apr 16 04:35:13.940749 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:13.940713 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-766d555cff-4lsmj"]
Apr 16 04:35:13.941258 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:13.940998 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-766d555cff-4lsmj" podUID="ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db" containerName="authorino" containerID="cri-o://1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044" gracePeriod=30
Apr 16 04:35:14.182305 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.182281 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-766d555cff-4lsmj"
Apr 16 04:35:14.214388 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.214263 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db-tls-cert\") pod \"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db\" (UID: \"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db\") "
Apr 16 04:35:14.214388 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.214381 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x5n8\" (UniqueName: \"kubernetes.io/projected/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db-kube-api-access-7x5n8\") pod \"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db\" (UID: \"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db\") "
Apr 16 04:35:14.216852 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.216806 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db-kube-api-access-7x5n8" (OuterVolumeSpecName: "kube-api-access-7x5n8") pod "ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db" (UID: "ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db"). InnerVolumeSpecName "kube-api-access-7x5n8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:35:14.227063 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.227030 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db" (UID: "ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:35:14.247915 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.247881 2575 generic.go:358] "Generic (PLEG): container finished" podID="ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db" containerID="1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044" exitCode=0
Apr 16 04:35:14.248112 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.247934 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-766d555cff-4lsmj"
Apr 16 04:35:14.248112 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.247941 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-766d555cff-4lsmj" event={"ID":"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db","Type":"ContainerDied","Data":"1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044"}
Apr 16 04:35:14.248112 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.247980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-766d555cff-4lsmj" event={"ID":"ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db","Type":"ContainerDied","Data":"fe8503aad63d2bda9c3c783563ef82d5ffaff5889d72feccf57cb5f51aa25342"}
Apr 16 04:35:14.248112 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.247998 2575 scope.go:117] "RemoveContainer" containerID="1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044"
Apr 16 04:35:14.257189 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.257170 2575 scope.go:117] "RemoveContainer" containerID="1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044"
Apr 16 04:35:14.257478 ip-10-0-133-103 kubenswrapper[2575]: E0416 04:35:14.257456 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044\": container with ID starting with 1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044 not found: ID does not exist" containerID="1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044"
Apr 16 04:35:14.257557 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.257485 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044"} err="failed to get container status \"1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044\": rpc error: code = NotFound desc = could not find container \"1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044\": container with ID starting with 1e40075f4f71c8366af3b1c52bb5bcc8e796346d65dc73aa68da3742101fa044 not found: ID does not exist"
Apr 16 04:35:14.269486 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.269458 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-766d555cff-4lsmj"]
Apr 16 04:35:14.271639 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.271617 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-766d555cff-4lsmj"]
Apr 16 04:35:14.315690 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.315658 2575 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db-tls-cert\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:35:14.315690 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.315688 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7x5n8\" (UniqueName: \"kubernetes.io/projected/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db-kube-api-access-7x5n8\") on node \"ip-10-0-133-103.ec2.internal\" DevicePath \"\""
Apr 16 04:35:14.707294 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:14.707263 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db" path="/var/lib/kubelet/pods/ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db/volumes"
Apr 16 04:35:17.369459 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.369417 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k"]
Apr 16 04:35:17.369840 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.369799 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db" containerName="authorino"
Apr 16 04:35:17.369840 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.369810 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db" containerName="authorino"
Apr 16 04:35:17.369913 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.369877 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ecc2b8b6-1dda-46e6-ac6b-a9ef7b9981db" containerName="authorino"
Apr 16 04:35:17.374410 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.374392 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k"
Apr 16 04:35:17.377090 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.377068 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 16 04:35:17.378515 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.378495 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 16 04:35:17.378641 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.378563 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-gmrh6\""
Apr 16 04:35:17.378710 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.378533 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\""
Apr 16 04:35:17.381735 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.381715 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k"]
Apr 16 04:35:17.441877 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.441842 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k"
Apr 16 04:35:17.442048 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.441904 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k"
Apr 16 04:35:17.442048 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.441939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k"
Apr 16 04:35:17.442048 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.441957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k"
Apr 16 04:35:17.442048 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.441988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s6hl\" (UniqueName: \"kubernetes.io/projected/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-kube-api-access-4s6hl\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k"
Apr 16 04:35:17.442048 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.442036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k"
Apr 16 04:35:17.542960 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.542917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k"
Apr 16 04:35:17.543156 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.542974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k"
Apr 16 04:35:17.543156 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.542994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-home\") pod
\"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:17.543156 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.543018 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4s6hl\" (UniqueName: \"kubernetes.io/projected/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-kube-api-access-4s6hl\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:17.543156 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.543039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:17.543156 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.543084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:17.543561 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.543534 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:17.543621 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.543559 
2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:17.543621 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.543599 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:17.545678 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.545644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:17.545932 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.545913 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: \"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:17.554031 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.554003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s6hl\" (UniqueName: \"kubernetes.io/projected/fd45fa94-1a52-4f60-bef9-f5ae05e06e75-kube-api-access-4s6hl\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k\" (UID: 
\"fd45fa94-1a52-4f60-bef9-f5ae05e06e75\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:17.686798 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.686691 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:17.816174 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.816146 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k"] Apr 16 04:35:17.817840 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:35:17.817812 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd45fa94_1a52_4f60_bef9_f5ae05e06e75.slice/crio-70d8e09161d3ee1fd6fa4d1d282297ce7519f3db4235e9bf41ae95f98a85e54c WatchSource:0}: Error finding container 70d8e09161d3ee1fd6fa4d1d282297ce7519f3db4235e9bf41ae95f98a85e54c: Status 404 returned error can't find the container with id 70d8e09161d3ee1fd6fa4d1d282297ce7519f3db4235e9bf41ae95f98a85e54c Apr 16 04:35:17.820110 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:17.820095 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 04:35:18.268450 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:18.268418 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" event={"ID":"fd45fa94-1a52-4f60-bef9-f5ae05e06e75","Type":"ContainerStarted","Data":"70d8e09161d3ee1fd6fa4d1d282297ce7519f3db4235e9bf41ae95f98a85e54c"} Apr 16 04:35:25.302284 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:25.302247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" event={"ID":"fd45fa94-1a52-4f60-bef9-f5ae05e06e75","Type":"ContainerStarted","Data":"a314772d21361d7daae580f0331563099293e65f4dbe1949e725f832d2d8134c"} Apr 16 04:35:31.329064 ip-10-0-133-103 
kubenswrapper[2575]: I0416 04:35:31.329030 2575 generic.go:358] "Generic (PLEG): container finished" podID="fd45fa94-1a52-4f60-bef9-f5ae05e06e75" containerID="a314772d21361d7daae580f0331563099293e65f4dbe1949e725f832d2d8134c" exitCode=0 Apr 16 04:35:31.329477 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:31.329104 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" event={"ID":"fd45fa94-1a52-4f60-bef9-f5ae05e06e75","Type":"ContainerDied","Data":"a314772d21361d7daae580f0331563099293e65f4dbe1949e725f832d2d8134c"} Apr 16 04:35:35.348929 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:35.348842 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" event={"ID":"fd45fa94-1a52-4f60-bef9-f5ae05e06e75","Type":"ContainerStarted","Data":"2dd70c2d5fa40f5ea520d41daeb1c2bc8927836e2e9f7d43245190113fcbd04a"} Apr 16 04:35:35.349402 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:35.349050 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:35.365965 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:35.365910 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" podStartSLOduration=1.169847652 podStartE2EDuration="18.365895918s" podCreationTimestamp="2026-04-16 04:35:17 +0000 UTC" firstStartedPulling="2026-04-16 04:35:17.82022756 +0000 UTC m=+665.723003537" lastFinishedPulling="2026-04-16 04:35:35.016275826 +0000 UTC m=+682.919051803" observedRunningTime="2026-04-16 04:35:35.364890131 +0000 UTC m=+683.267666130" watchObservedRunningTime="2026-04-16 04:35:35.365895918 +0000 UTC m=+683.268671917" Apr 16 04:35:35.975192 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:35.975151 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j"] Apr 16 04:35:35.979219 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:35.979193 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:35.981814 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:35.981787 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 16 04:35:35.987887 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:35.987861 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j"] Apr 16 04:35:36.124643 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.124605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b93e3973-d0aa-4de9-8b08-53a47d8d9760-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.124643 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.124647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b93e3973-d0aa-4de9-8b08-53a47d8d9760-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.124878 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.124675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b93e3973-d0aa-4de9-8b08-53a47d8d9760-home\") pod 
\"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.124878 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.124772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bchqp\" (UniqueName: \"kubernetes.io/projected/b93e3973-d0aa-4de9-8b08-53a47d8d9760-kube-api-access-bchqp\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.124878 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.124850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b93e3973-d0aa-4de9-8b08-53a47d8d9760-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.125001 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.124916 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e3973-d0aa-4de9-8b08-53a47d8d9760-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.225990 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.225892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b93e3973-d0aa-4de9-8b08-53a47d8d9760-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " 
pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.225990 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.225935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b93e3973-d0aa-4de9-8b08-53a47d8d9760-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.225990 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.225958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b93e3973-d0aa-4de9-8b08-53a47d8d9760-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.226255 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.226006 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bchqp\" (UniqueName: \"kubernetes.io/projected/b93e3973-d0aa-4de9-8b08-53a47d8d9760-kube-api-access-bchqp\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.226255 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.226035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b93e3973-d0aa-4de9-8b08-53a47d8d9760-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.226255 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.226088 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e3973-d0aa-4de9-8b08-53a47d8d9760-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.226431 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.226407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b93e3973-d0aa-4de9-8b08-53a47d8d9760-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.226490 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.226437 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b93e3973-d0aa-4de9-8b08-53a47d8d9760-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.226490 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.226481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b93e3973-d0aa-4de9-8b08-53a47d8d9760-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.228475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.228451 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b93e3973-d0aa-4de9-8b08-53a47d8d9760-dshm\") pod 
\"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.228761 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.228740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e3973-d0aa-4de9-8b08-53a47d8d9760-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.233665 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.233638 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bchqp\" (UniqueName: \"kubernetes.io/projected/b93e3973-d0aa-4de9-8b08-53a47d8d9760-kube-api-access-bchqp\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gx46j\" (UID: \"b93e3973-d0aa-4de9-8b08-53a47d8d9760\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.290755 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.290712 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:36.423369 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:36.423339 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j"] Apr 16 04:35:36.425158 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:35:36.425126 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb93e3973_d0aa_4de9_8b08_53a47d8d9760.slice/crio-d8a8db100bb5960cf4cd6fbcfd437c7db8b4dbae2220716e0cfbc0587ff7a5cb WatchSource:0}: Error finding container d8a8db100bb5960cf4cd6fbcfd437c7db8b4dbae2220716e0cfbc0587ff7a5cb: Status 404 returned error can't find the container with id d8a8db100bb5960cf4cd6fbcfd437c7db8b4dbae2220716e0cfbc0587ff7a5cb Apr 16 04:35:37.359716 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:37.359671 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" event={"ID":"b93e3973-d0aa-4de9-8b08-53a47d8d9760","Type":"ContainerStarted","Data":"094b84ce206ad2c8a99b80dbd6665d5dbd8af4483036139a544538fe016e24ec"} Apr 16 04:35:37.359716 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:37.359715 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" event={"ID":"b93e3973-d0aa-4de9-8b08-53a47d8d9760","Type":"ContainerStarted","Data":"d8a8db100bb5960cf4cd6fbcfd437c7db8b4dbae2220716e0cfbc0587ff7a5cb"} Apr 16 04:35:46.367964 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:46.367925 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k" Apr 16 04:35:46.397154 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:46.397120 2575 generic.go:358] "Generic (PLEG): container finished" podID="b93e3973-d0aa-4de9-8b08-53a47d8d9760" 
containerID="094b84ce206ad2c8a99b80dbd6665d5dbd8af4483036139a544538fe016e24ec" exitCode=0 Apr 16 04:35:46.397359 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:46.397178 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" event={"ID":"b93e3973-d0aa-4de9-8b08-53a47d8d9760","Type":"ContainerDied","Data":"094b84ce206ad2c8a99b80dbd6665d5dbd8af4483036139a544538fe016e24ec"} Apr 16 04:35:47.403660 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:47.403620 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" event={"ID":"b93e3973-d0aa-4de9-8b08-53a47d8d9760","Type":"ContainerStarted","Data":"3d35ebaa2e0e2aa49eca3adb7985f9bd5d035b57a7890d0c50b20d341ea41e63"} Apr 16 04:35:47.404144 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:47.403832 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:47.422771 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:47.422713 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" podStartSLOduration=12.24324688 podStartE2EDuration="12.422696977s" podCreationTimestamp="2026-04-16 04:35:35 +0000 UTC" firstStartedPulling="2026-04-16 04:35:46.397821766 +0000 UTC m=+694.300597742" lastFinishedPulling="2026-04-16 04:35:46.57727186 +0000 UTC m=+694.480047839" observedRunningTime="2026-04-16 04:35:47.42052802 +0000 UTC m=+695.323304020" watchObservedRunningTime="2026-04-16 04:35:47.422696977 +0000 UTC m=+695.325473039" Apr 16 04:35:50.371412 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.371376 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw"] Apr 16 04:35:50.377422 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.377397 2575 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.381364 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.381320 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 16 04:35:50.383636 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.383613 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw"] Apr 16 04:35:50.459075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.459041 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67e85ac4-e072-439e-bc31-0451314d3e94-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.459248 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.459088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/67e85ac4-e072-439e-bc31-0451314d3e94-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.459248 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.459127 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/67e85ac4-e072-439e-bc31-0451314d3e94-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.459323 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.459258 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/67e85ac4-e072-439e-bc31-0451314d3e94-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.459376 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.459317 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/67e85ac4-e072-439e-bc31-0451314d3e94-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.459376 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.459365 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjxch\" (UniqueName: \"kubernetes.io/projected/67e85ac4-e072-439e-bc31-0451314d3e94-kube-api-access-pjxch\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.560679 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.560639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/67e85ac4-e072-439e-bc31-0451314d3e94-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.560874 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.560720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/67e85ac4-e072-439e-bc31-0451314d3e94-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: 
\"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.560874 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.560750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/67e85ac4-e072-439e-bc31-0451314d3e94-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.560874 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.560770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjxch\" (UniqueName: \"kubernetes.io/projected/67e85ac4-e072-439e-bc31-0451314d3e94-kube-api-access-pjxch\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.560874 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.560818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67e85ac4-e072-439e-bc31-0451314d3e94-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.560874 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.560857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/67e85ac4-e072-439e-bc31-0451314d3e94-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.561184 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.561160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/67e85ac4-e072-439e-bc31-0451314d3e94-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.561241 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.561220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/67e85ac4-e072-439e-bc31-0451314d3e94-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.561294 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.561269 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67e85ac4-e072-439e-bc31-0451314d3e94-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.563137 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.563115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/67e85ac4-e072-439e-bc31-0451314d3e94-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.563305 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.563277 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/67e85ac4-e072-439e-bc31-0451314d3e94-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.568585 
ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.568560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjxch\" (UniqueName: \"kubernetes.io/projected/67e85ac4-e072-439e-bc31-0451314d3e94-kube-api-access-pjxch\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-pxbhw\" (UID: \"67e85ac4-e072-439e-bc31-0451314d3e94\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.713142 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.713051 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:50.838243 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:50.838213 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw"] Apr 16 04:35:50.840041 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:35:50.840005 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67e85ac4_e072_439e_bc31_0451314d3e94.slice/crio-55f2da4f5965d85c3b0cc83e31f39bc4e18e80af902532dd1b0b3ef36a4fe4a2 WatchSource:0}: Error finding container 55f2da4f5965d85c3b0cc83e31f39bc4e18e80af902532dd1b0b3ef36a4fe4a2: Status 404 returned error can't find the container with id 55f2da4f5965d85c3b0cc83e31f39bc4e18e80af902532dd1b0b3ef36a4fe4a2 Apr 16 04:35:51.430163 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:51.430118 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" event={"ID":"67e85ac4-e072-439e-bc31-0451314d3e94","Type":"ContainerStarted","Data":"ee6e07c83cb1dcafd4f5a3ffbb18a2190a0bd182a83b2132e52895f8c9fb1de0"} Apr 16 04:35:51.430163 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:51.430170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" 
event={"ID":"67e85ac4-e072-439e-bc31-0451314d3e94","Type":"ContainerStarted","Data":"55f2da4f5965d85c3b0cc83e31f39bc4e18e80af902532dd1b0b3ef36a4fe4a2"} Apr 16 04:35:57.458198 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:57.458167 2575 generic.go:358] "Generic (PLEG): container finished" podID="67e85ac4-e072-439e-bc31-0451314d3e94" containerID="ee6e07c83cb1dcafd4f5a3ffbb18a2190a0bd182a83b2132e52895f8c9fb1de0" exitCode=0 Apr 16 04:35:57.458595 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:57.458241 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" event={"ID":"67e85ac4-e072-439e-bc31-0451314d3e94","Type":"ContainerDied","Data":"ee6e07c83cb1dcafd4f5a3ffbb18a2190a0bd182a83b2132e52895f8c9fb1de0"} Apr 16 04:35:58.421571 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:58.421544 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gx46j" Apr 16 04:35:58.465110 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:58.465077 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" event={"ID":"67e85ac4-e072-439e-bc31-0451314d3e94","Type":"ContainerStarted","Data":"7118b9bc48caa59fa04eb677b3e7838c41645e07d79c4825fc69c1189825e2e4"} Apr 16 04:35:58.465497 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:58.465369 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:35:58.485618 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:35:58.485562 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" podStartSLOduration=8.087692909 podStartE2EDuration="8.485546845s" podCreationTimestamp="2026-04-16 04:35:50 +0000 UTC" firstStartedPulling="2026-04-16 04:35:57.458931267 +0000 UTC m=+705.361707244" 
lastFinishedPulling="2026-04-16 04:35:57.856785199 +0000 UTC m=+705.759561180" observedRunningTime="2026-04-16 04:35:58.481279154 +0000 UTC m=+706.384055167" watchObservedRunningTime="2026-04-16 04:35:58.485546845 +0000 UTC m=+706.388322855" Apr 16 04:36:09.482090 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:36:09.482057 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-pxbhw" Apr 16 04:39:13.361372 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:39:13.361248 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:39:13.365644 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:39:13.362380 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:44:13.392752 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:44:13.392628 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:44:13.395969 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:44:13.395949 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:49:13.428972 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:49:13.428852 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:49:13.434244 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:49:13.434220 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:54:13.460254 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:13.460131 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:54:13.469210 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:13.469185 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log" Apr 16 04:54:48.139070 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:48.139039 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-54997d6c57-4xkbg_0f5cd07a-518a-4d00-b76d-86795485da57/maas-api/0.log" Apr 16 04:54:48.256024 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:48.255992 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-5757c45c47-9hxdq_252cb17f-e425-41c7-92e5-6f96222c7ab1/manager/0.log" Apr 16 04:54:48.712985 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:48.712957 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-c7946b447-rkb2q_41916b10-588d-4396-b383-4bd43536f029/manager/0.log" Apr 16 04:54:49.587599 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:49.587557 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9_ace60148-af6e-4ac7-aa73-cc2276b4f3c7/pull/0.log" Apr 16 04:54:49.593795 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:49.593775 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9_ace60148-af6e-4ac7-aa73-cc2276b4f3c7/extract/0.log" Apr 16 04:54:49.599754 
ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:49.599730 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9_ace60148-af6e-4ac7-aa73-cc2276b4f3c7/util/0.log" Apr 16 04:54:49.709905 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:49.709880 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn_5c4034f6-34d8-4652-b48a-c39d4dca98db/extract/0.log" Apr 16 04:54:49.715943 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:49.715919 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn_5c4034f6-34d8-4652-b48a-c39d4dca98db/util/0.log" Apr 16 04:54:49.722092 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:49.722071 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn_5c4034f6-34d8-4652-b48a-c39d4dca98db/pull/0.log" Apr 16 04:54:49.829534 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:49.829508 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4_0883035b-5114-4548-a7e0-5145e03e2f01/util/0.log" Apr 16 04:54:49.835841 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:49.835796 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4_0883035b-5114-4548-a7e0-5145e03e2f01/pull/0.log" Apr 16 04:54:49.841579 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:49.841504 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4_0883035b-5114-4548-a7e0-5145e03e2f01/extract/0.log" Apr 16 04:54:49.951812 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:49.951789 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff_29185a8a-aac1-4386-8cb3-10e29ac2369e/util/0.log" Apr 16 04:54:49.957684 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:49.957649 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff_29185a8a-aac1-4386-8cb3-10e29ac2369e/pull/0.log" Apr 16 04:54:49.963075 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:49.963052 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff_29185a8a-aac1-4386-8cb3-10e29ac2369e/extract/0.log" Apr 16 04:54:50.304408 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:50.304382 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9cp6s_3fdbb361-f9e6-49bc-ad0c-aade3124e9fc/manager/0.log" Apr 16 04:54:50.531811 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:50.531780 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-lllgk_6b976af6-87f2-4d13-850b-af9235ade66d/registry-server/0.log" Apr 16 04:54:50.645748 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:50.645662 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs_ea10468f-116f-46e5-b15d-dc8a905e4834/manager/0.log" Apr 16 04:54:50.870394 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:50.870363 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-fl4zz_6724642e-f23d-4124-bd71-50f9d98f8ed6/manager/0.log" Apr 16 04:54:51.205481 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:51.205451 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff_c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8/istio-proxy/0.log" Apr 16 04:54:51.762671 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:51.762626 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5d4bf78676-l9b55_a33cde98-e7f0-4bc9-a9e5-1a6136039882/router/0.log" Apr 16 04:54:52.106117 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:52.106041 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k_fd45fa94-1a52-4f60-bef9-f5ae05e06e75/storage-initializer/0.log" Apr 16 04:54:52.112884 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:52.112862 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-xqz4k_fd45fa94-1a52-4f60-bef9-f5ae05e06e75/main/0.log" Apr 16 04:54:52.230312 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:52.230285 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-pxbhw_67e85ac4-e072-439e-bc31-0451314d3e94/main/0.log" Apr 16 04:54:52.238137 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:52.238114 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-pxbhw_67e85ac4-e072-439e-bc31-0451314d3e94/storage-initializer/0.log" Apr 16 04:54:52.593854 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:52.593827 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-gx46j_b93e3973-d0aa-4de9-8b08-53a47d8d9760/storage-initializer/0.log" Apr 16 04:54:52.601178 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:52.601157 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-gx46j_b93e3973-d0aa-4de9-8b08-53a47d8d9760/main/0.log" Apr 16 04:54:59.276323 ip-10-0-133-103 
kubenswrapper[2575]: I0416 04:54:59.276290 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fm95k_5b75a953-7e5c-49dd-823f-92d909687592/global-pull-secret-syncer/0.log" Apr 16 04:54:59.360955 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:59.360921 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-c4k78_fbd09ec2-88c2-46c7-b3da-09c43c74e6ab/konnectivity-agent/0.log" Apr 16 04:54:59.404759 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:54:59.404729 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-103.ec2.internal_036d1e1671f062f79bfb2c499603021f/haproxy/0.log" Apr 16 04:55:03.551818 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:03.551783 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9_ace60148-af6e-4ac7-aa73-cc2276b4f3c7/extract/0.log" Apr 16 04:55:03.578180 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:03.578143 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9_ace60148-af6e-4ac7-aa73-cc2276b4f3c7/util/0.log" Apr 16 04:55:03.603101 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:03.603077 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759t9bx9_ace60148-af6e-4ac7-aa73-cc2276b4f3c7/pull/0.log" Apr 16 04:55:03.637963 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:03.637934 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn_5c4034f6-34d8-4652-b48a-c39d4dca98db/extract/0.log" Apr 16 04:55:03.664531 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:03.664499 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn_5c4034f6-34d8-4652-b48a-c39d4dca98db/util/0.log" Apr 16 04:55:03.688207 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:03.688175 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0lwbrn_5c4034f6-34d8-4652-b48a-c39d4dca98db/pull/0.log" Apr 16 04:55:03.716635 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:03.716605 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4_0883035b-5114-4548-a7e0-5145e03e2f01/extract/0.log" Apr 16 04:55:03.736476 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:03.736442 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4_0883035b-5114-4548-a7e0-5145e03e2f01/util/0.log" Apr 16 04:55:03.763425 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:03.763396 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736lkx4_0883035b-5114-4548-a7e0-5145e03e2f01/pull/0.log" Apr 16 04:55:03.793996 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:03.793961 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff_29185a8a-aac1-4386-8cb3-10e29ac2369e/extract/0.log" Apr 16 04:55:03.815117 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:03.815045 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff_29185a8a-aac1-4386-8cb3-10e29ac2369e/util/0.log" Apr 16 04:55:03.840533 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:03.840504 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ltqff_29185a8a-aac1-4386-8cb3-10e29ac2369e/pull/0.log" Apr 16 04:55:04.139045 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:04.138956 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9cp6s_3fdbb361-f9e6-49bc-ad0c-aade3124e9fc/manager/0.log" Apr 16 04:55:04.207397 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:04.207370 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-lllgk_6b976af6-87f2-4d13-850b-af9235ade66d/registry-server/0.log" Apr 16 04:55:04.318343 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:04.318304 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-ckrjs_ea10468f-116f-46e5-b15d-dc8a905e4834/manager/0.log" Apr 16 04:55:04.405493 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:04.405407 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-fl4zz_6724642e-f23d-4124-bd71-50f9d98f8ed6/manager/0.log" Apr 16 04:55:05.829658 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:05.829628 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_11531649-599b-411c-a280-3750a982723c/alertmanager/0.log" Apr 16 04:55:05.851698 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:05.851671 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_11531649-599b-411c-a280-3750a982723c/config-reloader/0.log" Apr 16 04:55:05.873559 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:05.873530 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_11531649-599b-411c-a280-3750a982723c/kube-rbac-proxy-web/0.log" Apr 16 04:55:05.893544 ip-10-0-133-103 
kubenswrapper[2575]: I0416 04:55:05.893517 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_11531649-599b-411c-a280-3750a982723c/kube-rbac-proxy/0.log" Apr 16 04:55:05.914844 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:05.914813 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_11531649-599b-411c-a280-3750a982723c/kube-rbac-proxy-metric/0.log" Apr 16 04:55:05.938753 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:05.938725 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_11531649-599b-411c-a280-3750a982723c/prom-label-proxy/0.log" Apr 16 04:55:05.959928 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:05.959897 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_11531649-599b-411c-a280-3750a982723c/init-config-reloader/0.log" Apr 16 04:55:06.322298 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:06.322268 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lmlvh_b4ab77b7-6bc2-4d37-b605-775e02af4e25/node-exporter/0.log" Apr 16 04:55:06.343286 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:06.343255 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lmlvh_b4ab77b7-6bc2-4d37-b605-775e02af4e25/kube-rbac-proxy/0.log" Apr 16 04:55:06.366689 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:06.366662 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lmlvh_b4ab77b7-6bc2-4d37-b605-775e02af4e25/init-textfile/0.log" Apr 16 04:55:06.468187 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:06.468156 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-f2phz_caa3e4a8-086e-41c4-8c8d-baead7f9e776/kube-rbac-proxy-main/0.log" Apr 16 04:55:06.489291 
ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:06.489261 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-f2phz_caa3e4a8-086e-41c4-8c8d-baead7f9e776/kube-rbac-proxy-self/0.log" Apr 16 04:55:06.512034 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:06.512006 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-f2phz_caa3e4a8-086e-41c4-8c8d-baead7f9e776/openshift-state-metrics/0.log" Apr 16 04:55:06.881862 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:06.881830 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84c6dc88d4-45qc8_b1e4d17e-936f-4c41-ae50-b17dd9528813/thanos-query/0.log" Apr 16 04:55:06.904682 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:06.904657 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84c6dc88d4-45qc8_b1e4d17e-936f-4c41-ae50-b17dd9528813/kube-rbac-proxy-web/0.log" Apr 16 04:55:06.925949 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:06.925923 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84c6dc88d4-45qc8_b1e4d17e-936f-4c41-ae50-b17dd9528813/kube-rbac-proxy/0.log" Apr 16 04:55:06.946820 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:06.946793 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84c6dc88d4-45qc8_b1e4d17e-936f-4c41-ae50-b17dd9528813/prom-label-proxy/0.log" Apr 16 04:55:06.968161 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:06.968135 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84c6dc88d4-45qc8_b1e4d17e-936f-4c41-ae50-b17dd9528813/kube-rbac-proxy-rules/0.log" Apr 16 04:55:06.992849 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:06.992822 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-84c6dc88d4-45qc8_b1e4d17e-936f-4c41-ae50-b17dd9528813/kube-rbac-proxy-metrics/0.log" Apr 16 04:55:07.685970 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.685936 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth"] Apr 16 04:55:07.689797 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.689773 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" Apr 16 04:55:07.692711 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.692687 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4r9jh\"/\"openshift-service-ca.crt\"" Apr 16 04:55:07.692829 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.692688 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-4r9jh\"/\"default-dockercfg-4f7rr\"" Apr 16 04:55:07.693839 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.693817 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4r9jh\"/\"kube-root-ca.crt\"" Apr 16 04:55:07.699872 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.699848 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth"] Apr 16 04:55:07.781057 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.781017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a7fe5de-cf6e-4266-82bf-aa3403dde010-sys\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" Apr 16 04:55:07.781057 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.781063 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7a7fe5de-cf6e-4266-82bf-aa3403dde010-podres\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" Apr 16 04:55:07.781289 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.781177 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a7fe5de-cf6e-4266-82bf-aa3403dde010-lib-modules\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" Apr 16 04:55:07.781289 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.781237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7a7fe5de-cf6e-4266-82bf-aa3403dde010-proc\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" Apr 16 04:55:07.781289 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.781272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q976m\" (UniqueName: \"kubernetes.io/projected/7a7fe5de-cf6e-4266-82bf-aa3403dde010-kube-api-access-q976m\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" Apr 16 04:55:07.881772 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.881724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a7fe5de-cf6e-4266-82bf-aa3403dde010-sys\") pod \"perf-node-gather-daemonset-r6hth\" (UID: 
\"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" Apr 16 04:55:07.881772 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.881772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7a7fe5de-cf6e-4266-82bf-aa3403dde010-podres\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" Apr 16 04:55:07.882238 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.881820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a7fe5de-cf6e-4266-82bf-aa3403dde010-lib-modules\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" Apr 16 04:55:07.882238 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.881870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7a7fe5de-cf6e-4266-82bf-aa3403dde010-proc\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" Apr 16 04:55:07.882238 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.881867 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a7fe5de-cf6e-4266-82bf-aa3403dde010-sys\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" Apr 16 04:55:07.882238 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.881916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q976m\" (UniqueName: 
\"kubernetes.io/projected/7a7fe5de-cf6e-4266-82bf-aa3403dde010-kube-api-access-q976m\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth"
Apr 16 04:55:07.882238 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.881944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7a7fe5de-cf6e-4266-82bf-aa3403dde010-proc\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth"
Apr 16 04:55:07.882238 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.881970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7a7fe5de-cf6e-4266-82bf-aa3403dde010-podres\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth"
Apr 16 04:55:07.882238 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.881992 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a7fe5de-cf6e-4266-82bf-aa3403dde010-lib-modules\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth"
Apr 16 04:55:07.889150 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:07.889130 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q976m\" (UniqueName: \"kubernetes.io/projected/7a7fe5de-cf6e-4266-82bf-aa3403dde010-kube-api-access-q976m\") pod \"perf-node-gather-daemonset-r6hth\" (UID: \"7a7fe5de-cf6e-4266-82bf-aa3403dde010\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth"
Apr 16 04:55:08.000680 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:08.000654 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth"
Apr 16 04:55:08.135476 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:08.135443 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth"]
Apr 16 04:55:08.136896 ip-10-0-133-103 kubenswrapper[2575]: W0416 04:55:08.136859 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7a7fe5de_cf6e_4266_82bf_aa3403dde010.slice/crio-ca9b0f4e8a9b3e8db7af26817dbae2cf5d5bd2064b7409f4e7d48e7c546fc296 WatchSource:0}: Error finding container ca9b0f4e8a9b3e8db7af26817dbae2cf5d5bd2064b7409f4e7d48e7c546fc296: Status 404 returned error can't find the container with id ca9b0f4e8a9b3e8db7af26817dbae2cf5d5bd2064b7409f4e7d48e7c546fc296
Apr 16 04:55:08.138488 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:08.138472 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 04:55:08.440643 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:08.440563 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/2.log"
Apr 16 04:55:08.444714 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:08.444697 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kn57s_ed401949-3fdf-455d-bb0a-400a7a15466c/console-operator/3.log"
Apr 16 04:55:08.882543 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:08.882512 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5998c4dcdc-rfsn4_d1ed06a6-893b-4e37-b06e-a2d444b721cb/console/0.log"
Apr 16 04:55:08.912858 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:08.912829 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-q9zfd_fab06a06-1137-449c-860c-0eb917231cd3/download-server/0.log"
Apr 16 04:55:09.080830 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:09.080792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" event={"ID":"7a7fe5de-cf6e-4266-82bf-aa3403dde010","Type":"ContainerStarted","Data":"c4bab5e8c4ff2a4d80d541216a7f65d6e6fdb4985cf7fa7baccedff951aec16d"}
Apr 16 04:55:09.080830 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:09.080833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" event={"ID":"7a7fe5de-cf6e-4266-82bf-aa3403dde010","Type":"ContainerStarted","Data":"ca9b0f4e8a9b3e8db7af26817dbae2cf5d5bd2064b7409f4e7d48e7c546fc296"}
Apr 16 04:55:09.081155 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:09.080903 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth"
Apr 16 04:55:09.098884 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:09.098839 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth" podStartSLOduration=2.098824704 podStartE2EDuration="2.098824704s" podCreationTimestamp="2026-04-16 04:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:55:09.096784954 +0000 UTC m=+1856.999560954" watchObservedRunningTime="2026-04-16 04:55:09.098824704 +0000 UTC m=+1857.001600733"
Apr 16 04:55:10.116535 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:10.116501 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fvbjz_6a8f8c89-66e4-4302-bf2b-05fd369633cf/dns/0.log"
Apr 16 04:55:10.140550 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:10.140518 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fvbjz_6a8f8c89-66e4-4302-bf2b-05fd369633cf/kube-rbac-proxy/0.log"
Apr 16 04:55:10.299309 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:10.299274 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w7h7r_1d98373a-f359-466d-8c72-1e9b4e835c04/dns-node-resolver/0.log"
Apr 16 04:55:10.817456 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:10.817428 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dj5pp_f587afea-01f4-4981-9efb-09779dde706b/node-ca/0.log"
Apr 16 04:55:11.630371 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:11.630339 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfqkzff_c86bff4a-b5e8-4159-a34e-9c8bdf20b6b8/istio-proxy/0.log"
Apr 16 04:55:11.849545 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:11.849517 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5d4bf78676-l9b55_a33cde98-e7f0-4bc9-a9e5-1a6136039882/router/0.log"
Apr 16 04:55:12.369047 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:12.369018 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8pzwz_b0ca4f43-c47b-4ec0-82f8-3a8054f7a277/serve-healthcheck-canary/0.log"
Apr 16 04:55:12.955131 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:12.955103 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kmwkb_7d5fb402-61cd-404e-9bcf-644c7a73c048/kube-rbac-proxy/0.log"
Apr 16 04:55:12.975727 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:12.975704 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kmwkb_7d5fb402-61cd-404e-9bcf-644c7a73c048/exporter/0.log"
Apr 16 04:55:12.995196 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:12.995170 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kmwkb_7d5fb402-61cd-404e-9bcf-644c7a73c048/extractor/0.log"
Apr 16 04:55:14.915749 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:14.915716 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-54997d6c57-4xkbg_0f5cd07a-518a-4d00-b76d-86795485da57/maas-api/0.log"
Apr 16 04:55:14.969893 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:14.969863 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-5757c45c47-9hxdq_252cb17f-e425-41c7-92e5-6f96222c7ab1/manager/0.log"
Apr 16 04:55:15.087430 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:15.087398 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-c7946b447-rkb2q_41916b10-588d-4396-b383-4bd43536f029/manager/0.log"
Apr 16 04:55:15.094096 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:15.094077 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-r6hth"
Apr 16 04:55:16.423548 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:16.423518 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5988777b7d-tsvd6_5b421dd7-c112-44f4-bc6d-a4e455703e31/manager/0.log"
Apr 16 04:55:21.095830 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:21.095801 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-79vz7_b6856769-4d20-4817-9f00-63ded00b2073/kube-storage-version-migrator-operator/1.log"
Apr 16 04:55:21.096633 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:21.096613 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-79vz7_b6856769-4d20-4817-9f00-63ded00b2073/kube-storage-version-migrator-operator/0.log"
Apr 16 04:55:22.386805 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:22.386773 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cxwbn_56306845-377c-4423-8ae8-4f5824970788/kube-multus-additional-cni-plugins/0.log"
Apr 16 04:55:22.408487 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:22.408454 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cxwbn_56306845-377c-4423-8ae8-4f5824970788/egress-router-binary-copy/0.log"
Apr 16 04:55:22.443082 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:22.443056 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cxwbn_56306845-377c-4423-8ae8-4f5824970788/cni-plugins/0.log"
Apr 16 04:55:22.466754 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:22.466729 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cxwbn_56306845-377c-4423-8ae8-4f5824970788/bond-cni-plugin/0.log"
Apr 16 04:55:22.488475 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:22.488443 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cxwbn_56306845-377c-4423-8ae8-4f5824970788/routeoverride-cni/0.log"
Apr 16 04:55:22.510114 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:22.510085 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cxwbn_56306845-377c-4423-8ae8-4f5824970788/whereabouts-cni-bincopy/0.log"
Apr 16 04:55:22.534900 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:22.534871 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cxwbn_56306845-377c-4423-8ae8-4f5824970788/whereabouts-cni/0.log"
Apr 16 04:55:22.596553 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:22.596509 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qksrt_961e60a9-039d-43e7-8613-150ee4d6b85d/kube-multus/0.log"
Apr 16 04:55:22.651056 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:22.650967 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-875wc_414aeb12-26d8-4994-8fe8-1cad0342c929/network-metrics-daemon/0.log"
Apr 16 04:55:22.673534 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:22.673507 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-875wc_414aeb12-26d8-4994-8fe8-1cad0342c929/kube-rbac-proxy/0.log"
Apr 16 04:55:23.554097 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:23.554051 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bdrp_1b42bbbf-91b9-4c9a-b525-a6832184768c/ovn-controller/0.log"
Apr 16 04:55:23.579074 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:23.579043 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bdrp_1b42bbbf-91b9-4c9a-b525-a6832184768c/ovn-acl-logging/0.log"
Apr 16 04:55:23.597109 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:23.597078 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bdrp_1b42bbbf-91b9-4c9a-b525-a6832184768c/kube-rbac-proxy-node/0.log"
Apr 16 04:55:23.615826 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:23.615797 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bdrp_1b42bbbf-91b9-4c9a-b525-a6832184768c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 04:55:23.635671 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:23.635625 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bdrp_1b42bbbf-91b9-4c9a-b525-a6832184768c/northd/0.log"
Apr 16 04:55:23.655522 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:23.655482 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bdrp_1b42bbbf-91b9-4c9a-b525-a6832184768c/nbdb/0.log"
Apr 16 04:55:23.675430 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:23.675403 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bdrp_1b42bbbf-91b9-4c9a-b525-a6832184768c/sbdb/0.log"
Apr 16 04:55:23.774560 ip-10-0-133-103 kubenswrapper[2575]: I0416 04:55:23.774529 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bdrp_1b42bbbf-91b9-4c9a-b525-a6832184768c/ovnkube-controller/0.log"