Apr 20 19:22:49.501401 ip-10-0-131-162 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 19:22:49.501413 ip-10-0-131-162 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 19:22:49.501420 ip-10-0-131-162 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 19:22:49.501882 ip-10-0-131-162 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 19:22:59.502891 ip-10-0-131-162 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 19:22:59.502912 ip-10-0-131-162 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 953582b5a618411ea9c5e7cdd93e70f7 --
Apr 20 19:25:25.883750 ip-10-0-131-162 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 19:25:26.333456 ip-10-0-131-162 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:25:26.333456 ip-10-0-131-162 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 19:25:26.333456 ip-10-0-131-162 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:25:26.333456 ip-10-0-131-162 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 19:25:26.333456 ip-10-0-131-162 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
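
The failed boot above never reaches the kubelet binary: systemd cannot read an EnvironmentFile= referenced by kubelet.service, the ExecStartPre= task fails for the same reason, and the restart cannot even be scheduled because crio.service is not installed, so the unit ends with result 'resources'. A minimal diagnostic sketch, assuming the unit names from the log (run on the node as root):

    # Show the full unit definition, including its EnvironmentFile= and ExecStartPre= lines
    systemctl cat kubelet.service

    # Print the resolved environment-file and start-pre settings so each path can be checked
    systemctl show -p EnvironmentFiles -p ExecStartPre kubelet.service

    # Confirm whether the CRI-O unit the kubelet is wired to actually exists on this host
    systemctl list-unit-files crio.service

    # Logs from the failed (previous) boot
    journalctl -b -1 -u kubelet.service --no-pager
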
Apr 20 19:25:26.335897 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.335804 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 19:25:26.344434 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344412 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:25:26.344434 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344431 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:25:26.344434 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344435 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344453 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344457 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344460 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344463 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344466 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344468 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344471 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344474 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344478 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
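
Each of the deprecation warnings at the top of this boot points at the same fix: move the flag into the KubeletConfiguration file named by --config (per the FLAG dump below, /etc/kubernetes/kubelet.conf on this node). A minimal sketch of the equivalent stanzas, written to a scratch fragment to be merged into the existing file rather than overwriting it; field names are from the kubelet.config.k8s.io/v1beta1 schema, and the values are the ones this node already passes on the command line:

    # Sketch: KubeletConfiguration equivalents of the deprecated flags.
    # Merge this fragment into the file named by --config; do not replace it wholesale.
    cat <<'EOF' > /tmp/kubelet-conf-fragment.yaml
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: /var/run/crio/crio.sock
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    systemReserved:
      cpu: 500m
      ephemeral-storage: 1Gi
      memory: 1Gi
    EOF
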
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344482 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344485 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344488 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344491 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344494 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344497 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344500 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344503 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344506 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344508 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:26.344545 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344511 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344514 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344516 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344519 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344522 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344524 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344527 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344529 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344531 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344534 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344537 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344539 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344542 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344544 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344548 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344551 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344555 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344558 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344560 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344563 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:25:26.345023 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344565 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344569 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344572 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344575 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344577 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344580 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344583 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344585 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344589 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
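
The two "Setting ... feature gate" records above mean those gates were set explicitly rather than defaulted, while the "unrecognized feature gate" warnings are cluster-level OpenShift gate names this kubelet build simply does not know; the process keeps going, so they are noise rather than failures. Since the FLAG dump later in this boot shows --feature-gates="", the gate list evidently arrives through the --config file. A hypothetical sketch of such a stanza (the two names are taken from the effective gate map logged at the end of this section):

    # Hypothetical featureGates stanza in the KubeletConfiguration file;
    # names the kubelet does not recognize are logged as warnings like those above
    cat <<'EOF' >> /tmp/kubelet-conf-fragment.yaml
    featureGates:
      KMSv1: true
      ServiceAccountTokenNodeBinding: true
    EOF
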
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344593 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344596 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344599 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344602 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344604 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344607 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344609 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344611 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344614 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344616 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:25:26.345826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344619 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344622 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344624 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344627 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344630 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344632 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344636 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344641 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344645 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344648 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344650 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344653 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344656 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344658 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344661 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344663 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344665 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344668 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344670 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:26.346339 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344673 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344676 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344678 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344680 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344683 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.344685 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346711 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346726 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
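
The warnings are bulky because the gate set is parsed more than once per start: compare the klog timestamps of the .3444xx pass above with the .3467xx pass beginning here; the same names recur with fresh timestamps. To boil a capture like this down to the distinct gate names and how often each repeats, something like the following sketch works (assuming, as on this node, that the messages land in the journal under kubelet.service):

    journalctl -b -u kubelet.service --no-pager \
      | grep -o 'unrecognized feature gate: [A-Za-z0-9]*' \
      | sort | uniq -c | sort -rn
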
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346732 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346736 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346739 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346742 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346745 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346748 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346751 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346754 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346756 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346759 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346762 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346764 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:25:26.346818 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346767 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346769 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346772 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346774 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346777 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346780 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346783 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346786 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346790 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346792 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346795 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346798 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346800 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346803 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346805 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346808 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346810 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346813 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346815 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346819 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:25:26.347310 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346822 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346825 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346827 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346830 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346833 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346836 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346838 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346841 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346843 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346846 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346848 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346851 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346853 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346855 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346858 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346861 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346864 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346866 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346869 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:26.347899 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346873 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346877 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346880 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346883 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346885 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346887 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346890 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346893 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346896 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346899 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346901 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346904 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346906 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346909 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346912 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346915 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346917 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346919 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346924 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346926 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:25:26.348372 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346929 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346932 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346934 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346937 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346939 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346943 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346945 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346948 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346950 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346953 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346955 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346958 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.346965 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347035 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347043 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347051 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347055 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347060 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347063 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347068 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347073 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 19:25:26.348906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347076 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347079 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347083 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347087 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347090 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347094 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347097 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347100 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347103 2572 flags.go:64] FLAG: --cloud-config=""
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347105 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347108 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347113 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347116 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347119 2572 flags.go:64] FLAG: --config-dir=""
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347122 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347125 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347129 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347133 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347137 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347140 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347143 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347146 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347149 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347153 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347156 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 19:25:26.349422 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347160 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347164 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347167 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347170 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347174 2572 flags.go:64] FLAG: --enable-server="true"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347177 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347181 2572 flags.go:64] FLAG: --event-burst="100"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347192 2572 flags.go:64] FLAG: --event-qps="50"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347197 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347200 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347203 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347207 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347210 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347213 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347216 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347219 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347222 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347225 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347228 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347231 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347234 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347237 2572 flags.go:64] FLAG: --feature-gates=""
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347241 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347244 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347247 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 19:25:26.350052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347251 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347254 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347257 2572 flags.go:64] FLAG: --help="false"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347260 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-131-162.ec2.internal"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347263 2572 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347268 2572 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347271 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347274 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347277 2572 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347281 2572 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347284 2572 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347287 2572 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347289 2572 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347292 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347295 2572 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347298 2572 flags.go:64] FLAG: --kube-reserved=""
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347301 2572 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347304 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347307 2572 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347310 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347313 2572 flags.go:64] FLAG: --lock-file=""
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347316 2572 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347319 2572 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347323 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 19:25:26.350688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347332 2572 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347334 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347337 2572 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347340 2572 flags.go:64] FLAG: --logging-format="text"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347343 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347346 2572 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347349 2572 flags.go:64] FLAG: --manifest-url=""
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347352 2572 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347358 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347361 2572 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347366 2572 flags.go:64] FLAG: --max-pods="110"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347369 2572 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347372 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347375 2572 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347378 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347381 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347384 2572 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347391 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347399 2572 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347402 2572 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347405 2572 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347408 2572 flags.go:64] FLAG: --pod-cidr=""
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347411 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 19:25:26.351320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347416 2572 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347419 2572 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347422 2572 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347425 2572 flags.go:64] FLAG: --port="10250"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347428 2572 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347431 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-037e8055756a5f612"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347434 2572 flags.go:64] FLAG: --qos-reserved=""
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347450 2572 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347453 2572 flags.go:64] FLAG: --register-node="true"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347456 2572 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347459 2572 flags.go:64] FLAG: --register-with-taints=""
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347463 2572 flags.go:64] FLAG: --registry-burst="10"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347466 2572 flags.go:64] FLAG: --registry-qps="5"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347468 2572 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347471 2572 flags.go:64] FLAG: --reserved-memory=""
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347475 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347478 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347481 2572 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347484 2572 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347487 2572 flags.go:64] FLAG: --runonce="false"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347490 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347493 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347496 2572 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347500 2572 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347503 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347506 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 19:25:26.351906 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347511 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347514 2572 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347517 2572 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347520 2572 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347523 2572 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347526 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347529 2572 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347533 2572 flags.go:64] FLAG: --system-cgroups=""
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347535 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347541 2572 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347544 2572 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347547 2572 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347551 2572 flags.go:64] FLAG: --tls-min-version=""
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347554 2572 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347557 2572 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347560 2572 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347563 2572 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347566 2572 flags.go:64] FLAG: --v="2"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347570 2572 flags.go:64] FLAG: --version="false"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347574 2572 flags.go:64] FLAG: --vmodule=""
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347579 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.347582 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347685 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347689 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:25:26.352554 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347693 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347696 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347699 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347702 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347705 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347708 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347710 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347713 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347717 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347720 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347723 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347725 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347728 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347730 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347733 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347736 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347738 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347742 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347744 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347747 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347750 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:25:26.353142 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347752 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347755 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347757 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347760 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347762 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347765 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347767 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347770 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347772 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347775 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347777 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347780 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347782 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347785 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347788 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347791 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347794 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347796 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347799 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:25:26.353688 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347803 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347806 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347808 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347811 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347813 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347818 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347821 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347823 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347826 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347829 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347831 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347834 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347836 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347839 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347842 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347844 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347847 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347849 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347851 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:25:26.354169 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347854 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347857 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347859 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347862 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347864 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347867 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347870 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347872 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347875 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347879 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347881 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347884 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347886 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347890 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347893 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347895 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347898 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347900 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347904 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347908 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:25:26.354711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347911 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:26.355225 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347914 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:25:26.355225 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347917 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:25:26.355225 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347920 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:25:26.355225 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.347922 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:25:26.355225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.348740 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:25:26.355864 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.355741 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 19:25:26.355895 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.355866 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 19:25:26.355932 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355923 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:25:26.355932 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355929 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355933 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355937 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355940 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355943 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355946 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355948 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355951 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355953 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355956 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355959 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355961 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355965 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355968 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355971 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355974 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355977 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355980 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355982 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:26.355986 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355985 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355988 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355991 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355993 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355996 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.355999 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356001 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356004 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356007 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356010 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356013 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356016 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356019 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356021 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356023 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356026 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356029 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356031 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356034 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356038 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:25:26.356471 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356042 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356044 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356047 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356050 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356052 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356054 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356058 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356060 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356063 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356066 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356068 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356070 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356073 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356075 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356079 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356082 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356084 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356087 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356090 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356093 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:25:26.356971 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356095 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356098 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356100 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356103 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356105 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356108 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356111 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356113 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356116 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356118 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356122 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356125 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356127 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356130 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356133 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356137 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356141 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356143 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356146 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356149 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:25:26.357472 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356152 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356154 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356157 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356159 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356162 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356164 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.356171 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356274 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356279 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356282 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356285 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356288 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356290 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356293 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356296 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356298 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:25:26.357966 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356301 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356303 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356306 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356309 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356311 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356314 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356316 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356319 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356322 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356325 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356327 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356331 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356335 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356337 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356340 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356343 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356347 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356350 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356353 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:26.358418 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356356 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356358 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356361 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356364 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356366 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356369 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356372 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356374 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356377 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356380 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356382 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356385 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356388 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356390 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356392 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356395 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356398 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356402 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356405 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356408 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:25:26.358927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356410 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356413 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356416 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356418 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356421 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356423 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356426 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356428 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356431 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356433 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356451 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356456 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356459 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356461 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356464 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356466 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356469 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356472 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356474 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356478 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:25:26.359531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356480 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356483 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356486 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356488 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356491 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356493 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356496 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356498 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356501 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356503 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356506 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356509 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356512 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356514 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356517 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356520 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356522 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:25:26.360030 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:26.356525 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:25:26.360507 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.356530 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:25:26.360507 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.357271 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 19:25:26.360507 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.360149 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 19:25:26.361106 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.361094 2572 server.go:1019] "Starting client certificate rotation"
Apr 20 19:25:26.361216 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.361197 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:25:26.361272 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.361253 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:25:26.386450 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.386415 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:25:26.388885 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.388870 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:25:26.405088 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.405064 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 20 19:25:26.411473 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.411450 2572 log.go:25] "Validated CRI v1 image API"
Apr 20 19:25:26.412974 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.412952 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 19:25:26.413830 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.413811 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 19:25:26.420474 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.420454 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 898b5dd7-7c68-430a-b2cc-44cc391cc69b:/dev/nvme0n1p4 e525c139-64e4-4c60-a7eb-f7efca8266b1:/dev/nvme0n1p3]
Apr 20 19:25:26.420538 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.420473 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 19:25:26.427249 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.427124 2572 manager.go:217] Machine: {Timestamp:2026-04-20 19:25:26.424875478 +0000 UTC m=+0.415363015 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099610 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26490416f0149c2423128600b5ea5c SystemUUID:ec264904-16f0-149c-2423-128600b5ea5c BootID:953582b5-a618-411e-a9c5-e7cdd93e70f7 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:32:ac:f2:ea:d3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:32:ac:f2:ea:d3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ae:82:a7:c6:c4:e7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 19:25:26.427249 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.427242 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
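The bootstrap entries above ("Use the bootstrap credentials to request a cert", "Rotating certificates") are the standard kubelet TLS-bootstrap flow: the kubelet submits a CertificateSigningRequest with its bootstrap credentials, and the entries a little further down (csr.go:274/csr.go:270 for csr-hcz4c) show it waiting for approval and then receiving the signed certificate from the CSR's status. A polling sketch of the client side with client-go; the kubelet itself uses a watch (the *v1.CertificateSigningRequest reflector line above), and the kubeconfig path here is an assumption:

```go
package main

import (
	"context"
	"fmt"
	"time"

	certificatesv1 "k8s.io/api/certificates/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// waitForIssued polls a CSR until it carries an Approved condition and the
// signer has populated status.certificate -- the two states the kubelet logs
// as "approved, waiting to be issued" and "issued".
func waitForIssued(cs kubernetes.Interface, name string) ([]byte, error) {
	for {
		csr, err := cs.CertificatesV1().CertificateSigningRequests().Get(
			context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			return nil, err
		}
		approved := false
		for _, c := range csr.Status.Conditions {
			if c.Type == certificatesv1.CertificateApproved {
				approved = true
			}
		}
		if approved && len(csr.Status.Certificate) > 0 {
			return csr.Status.Certificate, nil
		}
		time.Sleep(time.Second)
	}
}

func main() {
	// Hypothetical kubeconfig path for this sketch.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/tmp/bootstrap-kubeconfig")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	cert, err := waitForIssued(cs, "csr-hcz4c") // CSR name taken from the log
	if err != nil {
		panic(err)
	}
	fmt.Printf("issued certificate: %d bytes\n", len(cert))
}
```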
Apr 20 19:25:26.427361 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.427329 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 19:25:26.428528 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.428505 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 19:25:26.428679 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.428530 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-162.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 19:25:26.428729 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.428688 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 19:25:26.428729 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.428697 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 19:25:26.428729 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.428710 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 19:25:26.428810 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.428729 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 19:25:26.430289 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.430278 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 19:25:26.430409 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.430400 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 19:25:26.433032 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.433020 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 19:25:26.433071 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.433039 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 19:25:26.433071 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.433055 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 19:25:26.433071 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.433069 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 20 19:25:26.433183 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.433097 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 19:25:26.433183 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.433119 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hcz4c"
Apr 20 19:25:26.434591 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.434579 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 19:25:26.434636 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.434597 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 19:25:26.437731 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.437715 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 19:25:26.438493 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.438474 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hcz4c"
Apr 20 19:25:26.439376 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.439359 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 19:25:26.441145 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.441130 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 19:25:26.441217 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.441151 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 19:25:26.441217 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.441172 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 19:25:26.441217 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.441182 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 19:25:26.441217 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.441192 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 19:25:26.441217 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.441208 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 19:25:26.441217 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.441218 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 19:25:26.441460 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.441227 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 19:25:26.441460 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.441237 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 19:25:26.441460 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.441246 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 19:25:26.441460 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.441267 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 19:25:26.441460 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.441281 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 19:25:26.442348 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.442338 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 19:25:26.442399 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.442354 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 19:25:26.448188 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.448156 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:25:26.450134 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.450117 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 19:25:26.450223 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.450175 2572 server.go:1295] "Started kubelet"
Apr 20 19:25:26.450284 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.450232 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 19:25:26.450549 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.450508 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 19:25:26.450614 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.450565 2572 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 19:25:26.451541 ip-10-0-131-162 systemd[1]: Started Kubernetes Kubelet.
Apr 20 19:25:26.452701 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.452684 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 19:25:26.452870 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.452854 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-162.ec2.internal" not found
Apr 20 19:25:26.453693 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.453680 2572 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 19:25:26.454621 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.454599 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:25:26.462193 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.462175 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 19:25:26.462281 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.462202 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 19:25:26.462877 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.462856 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 19:25:26.462877 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.462874 2572 factory.go:55] Registering systemd factory
Apr 20 19:25:26.463013 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.462883 2572 factory.go:223] Registration of the systemd container factory successfully
Apr 20 19:25:26.463070 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.463046 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 19:25:26.463070 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.463048 2572 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 19:25:26.463160 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.463074 2572 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 19:25:26.463160 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.463113 2572 factory.go:153] Registering CRI-O factory
Apr 20 19:25:26.463160 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.463125 2572 factory.go:223] Registration of the crio container factory successfully
Apr 20 19:25:26.463160 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.463143 2572 factory.go:103] Registering Raw factory
Apr 20 19:25:26.463160 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.463158 2572 manager.go:1196] Started watching for new ooms in manager
Apr 20 19:25:26.463390 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.463219 2572 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 19:25:26.463390 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.463225 2572 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 19:25:26.463551 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:26.463529 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-162.ec2.internal\" not found"
Apr 20 19:25:26.463642 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.463629 2572 manager.go:319] Starting recovery of all containers
Apr 20 19:25:26.463799 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:26.463739 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 19:25:26.464506 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.464484 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:25:26.467846 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:26.467823 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-162.ec2.internal\" not found" node="ip-10-0-131-162.ec2.internal"
Apr 20 19:25:26.467944 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.467881 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-162.ec2.internal" not found
Apr 20 19:25:26.473921 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.473900 2572 manager.go:324] Recovery completed
Apr 20 19:25:26.475661 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:26.475639 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 20 19:25:26.478979 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.478966 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:25:26.480802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.480788 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-162.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:25:26.480881 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.480821 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-162.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:25:26.480881 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.480832 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-162.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:25:26.481360 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.481345 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 19:25:26.481360 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.481359 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 19:25:26.481517 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.481379 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 19:25:26.484269 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.484255 2572 policy_none.go:49] "None policy: Start"
Apr 20 19:25:26.484339 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.484275 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 19:25:26.484339 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.484288 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 19:25:26.518150 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.518132 2572 manager.go:341] "Starting Device Plugin manager"
Apr 20 19:25:26.531843 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:26.518263 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 19:25:26.531843 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.518280 2572 server.go:85] "Starting device plugin registration server"
Apr 20 19:25:26.531843 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.518612 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 19:25:26.531843 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.518626 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 19:25:26.531843 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.518704 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 19:25:26.531843 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.518794 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 19:25:26.531843 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.518803 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 19:25:26.531843 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:26.519602 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 19:25:26.531843 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:26.519637 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-162.ec2.internal\" not found"
Apr 20 19:25:26.531843 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.526230 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-162.ec2.internal" not found
Apr 20 19:25:26.596214 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.596102 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 19:25:26.597497 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.597479 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 19:25:26.597573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.597509 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 19:25:26.597573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.597531 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
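The nodeConfig dump above (container_manager_linux.go:275) shows where the eviction manager started here gets its limits: the systemd cgroup driver, a SystemReserved carve-out of cpu 500m / memory 1Gi / ephemeral-storage 1Gi, PodPidsLimit 4096, and hard eviction thresholds of memory.available<100Mi plus percentage thresholds on nodefs and imagefs. A sketch of the equivalent KubeletConfiguration fragment using the upstream k8s.io/kubelet/config/v1beta1 types; this mirrors the logged values but is not the node's actual rendered config file, which on OpenShift is managed by the machine-config operator:

```go
package main

import (
	"encoding/json"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
)

func main() {
	pidLimit := int64(4096) // PodPidsLimit from the nodeConfig dump

	cfg := kubeletv1beta1.KubeletConfiguration{
		TypeMeta: metav1.TypeMeta{
			APIVersion: "kubelet.config.k8s.io/v1beta1",
			Kind:       "KubeletConfiguration",
		},
		CgroupDriver: "systemd",
		PodPidsLimit: &pidLimit,
		// SystemReserved matches the logged {"cpu":"500m","memory":"1Gi",
		// "ephemeral-storage":"1Gi"}.
		SystemReserved: map[string]string{
			"cpu":               "500m",
			"memory":            "1Gi",
			"ephemeral-storage": "1Gi",
		},
		// The HardEvictionThresholds in the dump, written back in the
		// percent/quantity notation the config file uses.
		EvictionHard: map[string]string{
			"memory.available":   "100Mi",
			"nodefs.available":   "10%",
			"nodefs.inodesFree":  "5%",
			"imagefs.available":  "15%",
			"imagefs.inodesFree": "5%",
		},
	}

	out, err := json.MarshalIndent(cfg, "", "  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```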
Apr 20 19:25:26.597573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.597539 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 19:25:26.597726 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:26.597632 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 19:25:26.600983 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.600958 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:25:26.618934 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.618882 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:25:26.620376 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.620357 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-162.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:25:26.620537 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.620388 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:25:26.620537 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.620402 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-162.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:25:26.620537 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.620426 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.630795 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.630774 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.698552 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.698499 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-162.ec2.internal"] Apr 20 19:25:26.700913 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.700894 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.700913 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.700905 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.731050 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.731024 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.735377 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.735360 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.749021 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.749004 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 19:25:26.752046 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.752030 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 19:25:26.765313 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.765288 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/aca2508fa5a89bde3f166bc71272b03f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-162.ec2.internal\" (UID: \"aca2508fa5a89bde3f166bc71272b03f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.765382 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.765315 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ddc67bd8f4881e0e6709f55e2cff023-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal\" (UID: \"7ddc67bd8f4881e0e6709f55e2cff023\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.765382 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.765335 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ddc67bd8f4881e0e6709f55e2cff023-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal\" (UID: \"7ddc67bd8f4881e0e6709f55e2cff023\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.865560 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.865468 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ddc67bd8f4881e0e6709f55e2cff023-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal\" (UID: \"7ddc67bd8f4881e0e6709f55e2cff023\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.865560 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.865507 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ddc67bd8f4881e0e6709f55e2cff023-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal\" (UID: \"7ddc67bd8f4881e0e6709f55e2cff023\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.865560 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.865531 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/aca2508fa5a89bde3f166bc71272b03f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-162.ec2.internal\" (UID: \"aca2508fa5a89bde3f166bc71272b03f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.865767 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.865573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ddc67bd8f4881e0e6709f55e2cff023-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal\" (UID: \"7ddc67bd8f4881e0e6709f55e2cff023\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.865767 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.865581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ddc67bd8f4881e0e6709f55e2cff023-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal\" (UID: \"7ddc67bd8f4881e0e6709f55e2cff023\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" Apr 20 19:25:26.865767 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:26.865575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/aca2508fa5a89bde3f166bc71272b03f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-162.ec2.internal\" (UID: \"aca2508fa5a89bde3f166bc71272b03f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-162.ec2.internal" Apr 20 19:25:27.053976 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.053944 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" Apr 20 19:25:27.054131 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.053946 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-162.ec2.internal" Apr 20 19:25:27.361057 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.361027 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 19:25:27.361722 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.361190 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 19:25:27.361722 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.361203 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 19:25:27.361722 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.361190 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 19:25:27.433506 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.433479 2572 apiserver.go:52] "Watching apiserver" Apr 20 19:25:27.440107 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.440074 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 19:20:26 +0000 UTC" deadline="2027-11-27 17:57:23.280138605 +0000 UTC" Apr 20 19:25:27.440107 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.440102 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14062h31m55.840039095s" Apr 20 19:25:27.440910 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.440887 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 19:25:27.442033 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.442014 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-7cd7d","openshift-network-operator/iptables-alerter-9v7nh","openshift-ovn-kubernetes/ovnkube-node-5ksvj","kube-system/kube-apiserver-proxy-ip-10-0-131-162.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg","openshift-dns/node-resolver-b4sls","openshift-image-registry/node-ca-vvbgf","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal","openshift-multus/multus-s7t97","openshift-network-diagnostics/network-check-target-clfqc","kube-system/konnectivity-agent-q97q5","openshift-cluster-node-tuning-operator/tuned-sjscp","openshift-multus/multus-additional-cni-plugins-2z5nt"] Apr 20 19:25:27.443744 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.443535 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:27.443744 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:27.443626 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:27.444695 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.444675 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9v7nh" Apr 20 19:25:27.446134 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.446113 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.449511 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.447953 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:25:27.449511 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.448242 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lr9cl\"" Apr 20 19:25:27.449511 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.448354 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 19:25:27.449511 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.448404 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 19:25:27.449511 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.448798 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 19:25:27.449511 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.448933 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 19:25:27.449511 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.449248 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 19:25:27.449511 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.449356 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-xnv4f\"" Apr 20 19:25:27.449951 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.449558 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 19:25:27.449951 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.449929 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 19:25:27.450319 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.450199 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-b4sls" Apr 20 19:25:27.450399 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.450334 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 19:25:27.450467 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.450272 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" Apr 20 19:25:27.453047 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.453023 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 19:25:27.453047 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.453042 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-lfkbv\"" Apr 20 19:25:27.453191 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.453103 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 19:25:27.453191 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.453133 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 19:25:27.453191 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.453148 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 19:25:27.453305 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.453212 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-66qhr\"" Apr 20 19:25:27.453305 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.453255 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 19:25:27.453525 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.453510 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vvbgf" Apr 20 19:25:27.454744 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.454726 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.455858 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.455840 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:27.455937 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:27.455892 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:27.455997 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.455982 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 19:25:27.456107 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.456085 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 19:25:27.456357 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.456341 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-btbzq\"" Apr 20 19:25:27.456508 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.456490 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 19:25:27.457036 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.457011 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 19:25:27.457119 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.457086 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q97q5" Apr 20 19:25:27.457525 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.457508 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 19:25:27.457613 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.457532 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 19:25:27.457613 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.457542 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8qcr9\"" Apr 20 19:25:27.457691 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.457632 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 19:25:27.458472 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.458456 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.459067 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.459052 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 19:25:27.459470 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.459457 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 19:25:27.459558 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.459534 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wsd5d\"" Apr 20 19:25:27.459696 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.459682 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" Apr 20 19:25:27.460394 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.460379 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 19:25:27.460483 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.460433 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:25:27.460763 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.460749 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ds5nn\"" Apr 20 19:25:27.461918 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.461899 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 19:25:27.462202 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.462183 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 19:25:27.462294 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.462249 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xrthd\"" Apr 20 19:25:27.462356 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.462303 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 19:25:27.464673 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.464598 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 19:25:27.467540 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467378 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f0d6e92-255c-4940-acda-67d406e4eeee-tmp\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.467621 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467554 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-cni-binary-copy\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt" Apr 20 19:25:27.467621 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/988fcf46-c192-47b7-a3ad-27d4676cf1f2-ovnkube-config\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.467621 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-var-lib-cni-multus\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.467621 ip-10-0-131-162 kubenswrapper[2572]: I0420 
19:25:27.467604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/04c2a819-0254-4203-82e8-e61c4b88f509-host-slash\") pod \"iptables-alerter-9v7nh\" (UID: \"04c2a819-0254-4203-82e8-e61c4b88f509\") " pod="openshift-network-operator/iptables-alerter-9v7nh" Apr 20 19:25:27.467621 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467619 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/988fcf46-c192-47b7-a3ad-27d4676cf1f2-env-overrides\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.467775 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467661 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-device-dir\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" Apr 20 19:25:27.467775 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467690 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-run\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.467775 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt" Apr 20 19:25:27.467775 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467759 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/988fcf46-c192-47b7-a3ad-27d4676cf1f2-ovnkube-script-lib\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.467886 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467777 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-system-cni-dir\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.467886 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467802 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.467886 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467821 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-multus-conf-dir\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.467886 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" Apr 20 19:25:27.467886 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-systemd\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.468029 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467890 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-os-release\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt" Apr 20 19:25:27.468029 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467916 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-cnibin\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.468029 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467938 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-sys-fs\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" Apr 20 19:25:27.468029 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467957 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgrjk\" (UniqueName: \"kubernetes.io/projected/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-kube-api-access-fgrjk\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt" Apr 20 19:25:27.468029 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.467981 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7phhm\" (UniqueName: \"kubernetes.io/projected/513dd790-7dbf-46da-821a-3493b9941466-kube-api-access-7phhm\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:27.468170 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468029 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-var-lib-openvswitch\") pod 
\"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.468170 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83a5e765-c988-4980-8534-55e55f1296d7-host\") pod \"node-ca-vvbgf\" (UID: \"83a5e765-c988-4980-8534-55e55f1296d7\") " pod="openshift-image-registry/node-ca-vvbgf" Apr 20 19:25:27.468170 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/066e3172-90cc-4dbf-9891-089727ab8561-cni-binary-copy\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.468170 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468096 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-multus-socket-dir-parent\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.468170 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468123 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-cnibin\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt" Apr 20 19:25:27.468170 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:27.468170 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468167 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-cni-netd\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.468359 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-var-lib-cni-bin\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.468359 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-run-multus-certs\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.468359 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468251 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-etc-kubernetes\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.468359 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468267 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-var-lib-kubelet\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.468359 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468289 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-system-cni-dir\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt" Apr 20 19:25:27.468359 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468315 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt" Apr 20 19:25:27.468359 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468341 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-hostroot\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.468573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468358 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-etc-selinux\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" Apr 20 19:25:27.468573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-kubelet\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.468573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468412 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sfzh\" (UniqueName: \"kubernetes.io/projected/988fcf46-c192-47b7-a3ad-27d4676cf1f2-kube-api-access-8sfzh\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.468573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468430 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-os-release\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.468573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/066e3172-90cc-4dbf-9891-089727ab8561-multus-daemon-config\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.468573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468476 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-kubernetes\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.468573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468489 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-sysctl-d\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.468573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-lib-modules\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.468573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468529 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-systemd-units\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.468573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468552 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-multus-cni-dir\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.468573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468566 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-sys\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468582 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-run-ovn\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.468876 ip-10-0-131-162 
kubenswrapper[2572]: I0420 19:25:27.468604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-log-socket\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468630 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0a25aae-e259-4ad8-b476-5694a4f39d1d-tmp-dir\") pod \"node-resolver-b4sls\" (UID: \"b0a25aae-e259-4ad8-b476-5694a4f39d1d\") " pod="openshift-dns/node-resolver-b4sls" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468658 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-socket-dir\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ca9e64aa-e049-4e11-b4ec-79ec745fa7c6-agent-certs\") pod \"konnectivity-agent-q97q5\" (UID: \"ca9e64aa-e049-4e11-b4ec-79ec745fa7c6\") " pod="kube-system/konnectivity-agent-q97q5" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468695 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-sysconfig\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468710 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-host\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-run-systemd\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468741 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-node-log\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468757 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468772 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0a25aae-e259-4ad8-b476-5694a4f39d1d-hosts-file\") pod \"node-resolver-b4sls\" (UID: \"b0a25aae-e259-4ad8-b476-5694a4f39d1d\") " pod="openshift-dns/node-resolver-b4sls" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468794 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-run-openvswitch\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/988fcf46-c192-47b7-a3ad-27d4676cf1f2-ovn-node-metrics-cert\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/83a5e765-c988-4980-8534-55e55f1296d7-serviceca\") pod \"node-ca-vvbgf\" (UID: \"83a5e765-c988-4980-8534-55e55f1296d7\") " pod="openshift-image-registry/node-ca-vvbgf" Apr 20 19:25:27.468876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-run-netns\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468880 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-var-lib-kubelet\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhg4x\" (UniqueName: \"kubernetes.io/projected/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-kube-api-access-hhg4x\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-modprobe-d\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " 
pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468946 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-tuned\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468959 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d25kx\" (UniqueName: \"kubernetes.io/projected/b0a25aae-e259-4ad8-b476-5694a4f39d1d-kube-api-access-d25kx\") pod \"node-resolver-b4sls\" (UID: \"b0a25aae-e259-4ad8-b476-5694a4f39d1d\") " pod="openshift-dns/node-resolver-b4sls" Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468974 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-run-k8s-cni-cncf-io\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.468989 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/04c2a819-0254-4203-82e8-e61c4b88f509-iptables-alerter-script\") pod \"iptables-alerter-9v7nh\" (UID: \"04c2a819-0254-4203-82e8-e61c4b88f509\") " pod="openshift-network-operator/iptables-alerter-9v7nh" Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.469003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjvfg\" (UniqueName: \"kubernetes.io/projected/04c2a819-0254-4203-82e8-e61c4b88f509-kube-api-access-kjvfg\") pod \"iptables-alerter-9v7nh\" (UID: \"04c2a819-0254-4203-82e8-e61c4b88f509\") " pod="openshift-network-operator/iptables-alerter-9v7nh" Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.469018 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ca9e64aa-e049-4e11-b4ec-79ec745fa7c6-konnectivity-ca\") pod \"konnectivity-agent-q97q5\" (UID: \"ca9e64aa-e049-4e11-b4ec-79ec745fa7c6\") " pod="kube-system/konnectivity-agent-q97q5" Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.469032 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-sysctl-conf\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.469047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpqwn\" (UniqueName: \"kubernetes.io/projected/6f0d6e92-255c-4940-acda-67d406e4eeee-kube-api-access-cpqwn\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.469293 ip-10-0-131-162 
kubenswrapper[2572]: I0420 19:25:27.469064 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.469085 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq6fj\" (UniqueName: \"kubernetes.io/projected/83a5e765-c988-4980-8534-55e55f1296d7-kube-api-access-zq6fj\") pod \"node-ca-vvbgf\" (UID: \"83a5e765-c988-4980-8534-55e55f1296d7\") " pod="openshift-image-registry/node-ca-vvbgf"
Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.469110 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb9qq\" (UniqueName: \"kubernetes.io/projected/066e3172-90cc-4dbf-9891-089727ab8561-kube-api-access-vb9qq\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.469123 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8gpq\" (UniqueName: \"kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq\") pod \"network-check-target-clfqc\" (UID: \"5c4225dd-c1ca-427e-8883-7929ed2c386e\") " pod="openshift-network-diagnostics/network-check-target-clfqc"
Apr 20 19:25:27.469293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.469146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-registration-dir\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.469779 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.469164 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-slash\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.469779 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.469180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-run-netns\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.469779 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.469193 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-etc-openvswitch\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.469779 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.469208 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-cni-bin\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.472747 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.472727 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 19:25:27.493106 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.493082 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-598mf"
Apr 20 19:25:27.503761 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.502811 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-598mf"
Apr 20 19:25:27.527634 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:27.527597 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ddc67bd8f4881e0e6709f55e2cff023.slice/crio-25c63ad1560e1b75b4a9a3d08ccbb40e353823c7364d389ff6b273aa56624155 WatchSource:0}: Error finding container 25c63ad1560e1b75b4a9a3d08ccbb40e353823c7364d389ff6b273aa56624155: Status 404 returned error can't find the container with id 25c63ad1560e1b75b4a9a3d08ccbb40e353823c7364d389ff6b273aa56624155
Apr 20 19:25:27.527838 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:27.527825 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaca2508fa5a89bde3f166bc71272b03f.slice/crio-a20dccfe00f71aef961bcb14b5319f697577a5b3dce9564ed0edddd79087e9e0 WatchSource:0}: Error finding container a20dccfe00f71aef961bcb14b5319f697577a5b3dce9564ed0edddd79087e9e0: Status 404 returned error can't find the container with id a20dccfe00f71aef961bcb14b5319f697577a5b3dce9564ed0edddd79087e9e0
Apr 20 19:25:27.533122 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.533102 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 19:25:27.569403 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569381 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-sys\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.569520 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569410 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-run-ovn\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569520 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-log-socket\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569520 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569457 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0a25aae-e259-4ad8-b476-5694a4f39d1d-tmp-dir\") pod \"node-resolver-b4sls\" (UID: \"b0a25aae-e259-4ad8-b476-5694a4f39d1d\") " pod="openshift-dns/node-resolver-b4sls"
Apr 20 19:25:27.569520 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569482 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-socket-dir\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.569520 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ca9e64aa-e049-4e11-b4ec-79ec745fa7c6-agent-certs\") pod \"konnectivity-agent-q97q5\" (UID: \"ca9e64aa-e049-4e11-b4ec-79ec745fa7c6\") " pod="kube-system/konnectivity-agent-q97q5"
Apr 20 19:25:27.569520 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569514 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-run-ovn\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569517 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-sys\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569531 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-sysconfig\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569518 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-log-socket\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-host\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569574 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-sysconfig\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-run-systemd\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569605 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-host\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-node-log\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569626 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-socket-dir\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0a25aae-e259-4ad8-b476-5694a4f39d1d-hosts-file\") pod \"node-resolver-b4sls\" (UID: \"b0a25aae-e259-4ad8-b476-5694a4f39d1d\") " pod="openshift-dns/node-resolver-b4sls"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-node-log\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-run-openvswitch\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569683 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569692 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-run-openvswitch\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-run-systemd\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569701 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/988fcf46-c192-47b7-a3ad-27d4676cf1f2-ovn-node-metrics-cert\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.569802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569697 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0a25aae-e259-4ad8-b476-5694a4f39d1d-hosts-file\") pod \"node-resolver-b4sls\" (UID: \"b0a25aae-e259-4ad8-b476-5694a4f39d1d\") " pod="openshift-dns/node-resolver-b4sls"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569736 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/83a5e765-c988-4980-8534-55e55f1296d7-serviceca\") pod \"node-ca-vvbgf\" (UID: \"83a5e765-c988-4980-8534-55e55f1296d7\") " pod="openshift-image-registry/node-ca-vvbgf"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-run-netns\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-var-lib-kubelet\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhg4x\" (UniqueName: \"kubernetes.io/projected/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-kube-api-access-hhg4x\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569822 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0a25aae-e259-4ad8-b476-5694a4f39d1d-tmp-dir\") pod \"node-resolver-b4sls\" (UID: \"b0a25aae-e259-4ad8-b476-5694a4f39d1d\") " pod="openshift-dns/node-resolver-b4sls"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569853 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-run-netns\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569848 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569871 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-var-lib-kubelet\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569904 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-modprobe-d\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-tuned\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569963 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d25kx\" (UniqueName: \"kubernetes.io/projected/b0a25aae-e259-4ad8-b476-5694a4f39d1d-kube-api-access-d25kx\") pod \"node-resolver-b4sls\" (UID: \"b0a25aae-e259-4ad8-b476-5694a4f39d1d\") " pod="openshift-dns/node-resolver-b4sls"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.569984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-run-k8s-cni-cncf-io\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570007 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/04c2a819-0254-4203-82e8-e61c4b88f509-iptables-alerter-script\") pod \"iptables-alerter-9v7nh\" (UID: \"04c2a819-0254-4203-82e8-e61c4b88f509\") " pod="openshift-network-operator/iptables-alerter-9v7nh"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570025 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-modprobe-d\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjvfg\" (UniqueName: \"kubernetes.io/projected/04c2a819-0254-4203-82e8-e61c4b88f509-kube-api-access-kjvfg\") pod \"iptables-alerter-9v7nh\" (UID: \"04c2a819-0254-4203-82e8-e61c4b88f509\") " pod="openshift-network-operator/iptables-alerter-9v7nh"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570055 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ca9e64aa-e049-4e11-b4ec-79ec745fa7c6-konnectivity-ca\") pod \"konnectivity-agent-q97q5\" (UID: \"ca9e64aa-e049-4e11-b4ec-79ec745fa7c6\") " pod="kube-system/konnectivity-agent-q97q5"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570079 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-run-k8s-cni-cncf-io\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.572133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570120 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-sysctl-conf\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpqwn\" (UniqueName: \"kubernetes.io/projected/6f0d6e92-255c-4940-acda-67d406e4eeee-kube-api-access-cpqwn\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570179 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq6fj\" (UniqueName: \"kubernetes.io/projected/83a5e765-c988-4980-8534-55e55f1296d7-kube-api-access-zq6fj\") pod \"node-ca-vvbgf\" (UID: \"83a5e765-c988-4980-8534-55e55f1296d7\") " pod="openshift-image-registry/node-ca-vvbgf"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570231 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb9qq\" (UniqueName: \"kubernetes.io/projected/066e3172-90cc-4dbf-9891-089727ab8561-kube-api-access-vb9qq\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gpq\" (UniqueName: \"kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq\") pod \"network-check-target-clfqc\" (UID: \"5c4225dd-c1ca-427e-8883-7929ed2c386e\") " pod="openshift-network-diagnostics/network-check-target-clfqc"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570265 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-sysctl-conf\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-registration-dir\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570336 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-slash\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-run-netns\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570388 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-etc-openvswitch\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570411 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-cni-bin\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570409 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f0d6e92-255c-4940-acda-67d406e4eeee-tmp\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570476 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-registration-dir\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570490 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-run-netns\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570504 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-cni-binary-copy\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.573087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570536 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/988fcf46-c192-47b7-a3ad-27d4676cf1f2-ovnkube-config\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-var-lib-cni-multus\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/04c2a819-0254-4203-82e8-e61c4b88f509-iptables-alerter-script\") pod \"iptables-alerter-9v7nh\" (UID: \"04c2a819-0254-4203-82e8-e61c4b88f509\") " pod="openshift-network-operator/iptables-alerter-9v7nh"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/04c2a819-0254-4203-82e8-e61c4b88f509-host-slash\") pod \"iptables-alerter-9v7nh\" (UID: \"04c2a819-0254-4203-82e8-e61c4b88f509\") " pod="openshift-network-operator/iptables-alerter-9v7nh"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/04c2a819-0254-4203-82e8-e61c4b88f509-host-slash\") pod \"iptables-alerter-9v7nh\" (UID: \"04c2a819-0254-4203-82e8-e61c4b88f509\") " pod="openshift-network-operator/iptables-alerter-9v7nh"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570655 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-slash\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/988fcf46-c192-47b7-a3ad-27d4676cf1f2-env-overrides\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.570723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-var-lib-cni-multus\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571180 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/988fcf46-c192-47b7-a3ad-27d4676cf1f2-env-overrides\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571232 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/988fcf46-c192-47b7-a3ad-27d4676cf1f2-ovnkube-config\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-device-dir\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571305 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-run\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571329 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571334 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ca9e64aa-e049-4e11-b4ec-79ec745fa7c6-konnectivity-ca\") pod \"konnectivity-agent-q97q5\" (UID: \"ca9e64aa-e049-4e11-b4ec-79ec745fa7c6\") " pod="kube-system/konnectivity-agent-q97q5"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571357 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/988fcf46-c192-47b7-a3ad-27d4676cf1f2-ovnkube-script-lib\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571384 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-system-cni-dir\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.574225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571419 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-run\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571458 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-multus-conf-dir\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571472 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-device-dir\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571485 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571522 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-etc-openvswitch\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571546 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/83a5e765-c988-4980-8534-55e55f1296d7-serviceca\") pod \"node-ca-vvbgf\" (UID: \"83a5e765-c988-4980-8534-55e55f1296d7\") " pod="openshift-image-registry/node-ca-vvbgf"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571594 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-systemd\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-os-release\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-systemd\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-cnibin\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571691 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-sys-fs\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571709 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-cni-bin\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgrjk\" (UniqueName: \"kubernetes.io/projected/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-kube-api-access-fgrjk\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7phhm\" (UniqueName: \"kubernetes.io/projected/513dd790-7dbf-46da-821a-3493b9941466-kube-api-access-7phhm\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571749 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-var-lib-openvswitch\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.574698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571800 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83a5e765-c988-4980-8534-55e55f1296d7-host\") pod \"node-ca-vvbgf\" (UID: \"83a5e765-c988-4980-8534-55e55f1296d7\") " pod="openshift-image-registry/node-ca-vvbgf"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571814 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-system-cni-dir\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/066e3172-90cc-4dbf-9891-089727ab8561-cni-binary-copy\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-multus-socket-dir-parent\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571877 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-sys-fs\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-cnibin\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-cni-netd\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-var-lib-cni-bin\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.571984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-run-multus-certs\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572010 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-etc-kubernetes\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572037 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-var-lib-kubelet\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572061 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-system-cni-dir\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572074 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/988fcf46-c192-47b7-a3ad-27d4676cf1f2-ovnkube-script-lib\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-multus-conf-dir\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-hostroot\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572158 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-cnibin\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572183 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-etc-selinux\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572307 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-os-release\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572311 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-kubelet\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572232 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-etc-selinux\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572257 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-var-lib-cni-bin\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572349 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8sfzh\" (UniqueName: \"kubernetes.io/projected/988fcf46-c192-47b7-a3ad-27d4676cf1f2-kube-api-access-8sfzh\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-etc-kubernetes\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-os-release\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-var-lib-kubelet\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/066e3172-90cc-4dbf-9891-089727ab8561-multus-daemon-config\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572489 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-system-cni-dir\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-cni-netd\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572621 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-hostroot\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572184 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-host-run-multus-certs\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:27.572328 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-kubernetes\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.575620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.572908 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-cni-binary-copy\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:27.573008 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs podName:513dd790-7dbf-46da-821a-3493b9941466 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:28.072958927 +0000 UTC m=+2.063446439 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs") pod "network-metrics-daemon-7cd7d" (UID: "513dd790-7dbf-46da-821a-3493b9941466") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-os-release\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573155 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-var-lib-openvswitch\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-kubernetes\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573188 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83a5e765-c988-4980-8534-55e55f1296d7-host\") pod \"node-ca-vvbgf\" (UID: \"83a5e765-c988-4980-8534-55e55f1296d7\") " pod="openshift-image-registry/node-ca-vvbgf"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-sysctl-d\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-lib-modules\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-systemd-units\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-sysctl-d\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f0d6e92-255c-4940-acda-67d406e4eeee-lib-modules\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573506 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-systemd-units\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573506 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-cnibin\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573534 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/988fcf46-c192-47b7-a3ad-27d4676cf1f2-host-kubelet\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-multus-cni-dir\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/066e3172-90cc-4dbf-9891-089727ab8561-cni-binary-copy\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.576101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573703 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.576570 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573708 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/066e3172-90cc-4dbf-9891-089727ab8561-multus-daemon-config\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.576570 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573757 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f0d6e92-255c-4940-acda-67d406e4eeee-tmp\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.576570 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573815 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-multus-cni-dir\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.576570 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573821 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6f0d6e92-255c-4940-acda-67d406e4eeee-etc-tuned\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.576570 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573851 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/066e3172-90cc-4dbf-9891-089727ab8561-multus-socket-dir-parent\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.576570 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573890 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/988fcf46-c192-47b7-a3ad-27d4676cf1f2-ovn-node-metrics-cert\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.576570 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.573991 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ca9e64aa-e049-4e11-b4ec-79ec745fa7c6-agent-certs\") pod \"konnectivity-agent-q97q5\" (UID: \"ca9e64aa-e049-4e11-b4ec-79ec745fa7c6\") " pod="kube-system/konnectivity-agent-q97q5"
Apr 20 19:25:27.578054 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:27.577966 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:25:27.578054 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:27.577989 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:25:27.578054 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:27.578003 2572 projected.go:194] Error preparing data for projected volume kube-api-access-z8gpq for pod openshift-network-diagnostics/network-check-target-clfqc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:27.578347 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:27.578324 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq podName:5c4225dd-c1ca-427e-8883-7929ed2c386e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:28.078280935 +0000 UTC m=+2.068768468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-z8gpq" (UniqueName: "kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq") pod "network-check-target-clfqc" (UID: "5c4225dd-c1ca-427e-8883-7929ed2c386e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:27.581495 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.581434 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpqwn\" (UniqueName: \"kubernetes.io/projected/6f0d6e92-255c-4940-acda-67d406e4eeee-kube-api-access-cpqwn\") pod \"tuned-sjscp\" (UID: \"6f0d6e92-255c-4940-acda-67d406e4eeee\") " pod="openshift-cluster-node-tuning-operator/tuned-sjscp"
Apr 20 19:25:27.581607 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.581579 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgrjk\" (UniqueName: \"kubernetes.io/projected/4f1d0d9b-42cd-49b6-9d9f-41487c76d136-kube-api-access-fgrjk\") pod \"multus-additional-cni-plugins-2z5nt\" (UID: \"4f1d0d9b-42cd-49b6-9d9f-41487c76d136\") " pod="openshift-multus/multus-additional-cni-plugins-2z5nt"
Apr 20 19:25:27.581740 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.581720 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjvfg\" (UniqueName: \"kubernetes.io/projected/04c2a819-0254-4203-82e8-e61c4b88f509-kube-api-access-kjvfg\") pod \"iptables-alerter-9v7nh\" (UID: \"04c2a819-0254-4203-82e8-e61c4b88f509\") " pod="openshift-network-operator/iptables-alerter-9v7nh"
Apr 20 19:25:27.583066 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.583041 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d25kx\" (UniqueName: \"kubernetes.io/projected/b0a25aae-e259-4ad8-b476-5694a4f39d1d-kube-api-access-d25kx\") pod \"node-resolver-b4sls\" (UID: \"b0a25aae-e259-4ad8-b476-5694a4f39d1d\") " pod="openshift-dns/node-resolver-b4sls"
Apr 20 19:25:27.584026 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.584001 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7phhm\" (UniqueName: \"kubernetes.io/projected/513dd790-7dbf-46da-821a-3493b9941466-kube-api-access-7phhm\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d"
Apr 20 19:25:27.584139 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.584068 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sfzh\" (UniqueName: \"kubernetes.io/projected/988fcf46-c192-47b7-a3ad-27d4676cf1f2-kube-api-access-8sfzh\") pod \"ovnkube-node-5ksvj\" (UID: \"988fcf46-c192-47b7-a3ad-27d4676cf1f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:25:27.584220 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.584148 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq6fj\" (UniqueName: \"kubernetes.io/projected/83a5e765-c988-4980-8534-55e55f1296d7-kube-api-access-zq6fj\") pod \"node-ca-vvbgf\" (UID: \"83a5e765-c988-4980-8534-55e55f1296d7\") " pod="openshift-image-registry/node-ca-vvbgf"
Apr 20 19:25:27.584278 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.584222 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhg4x\" (UniqueName: \"kubernetes.io/projected/8350cbc8-4d36-4f4a-af5b-45977dd7b9e6-kube-api-access-hhg4x\") pod \"aws-ebs-csi-driver-node-r6hlg\" (UID: \"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg"
Apr 20 19:25:27.584909 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.584893 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb9qq\" (UniqueName: \"kubernetes.io/projected/066e3172-90cc-4dbf-9891-089727ab8561-kube-api-access-vb9qq\") pod \"multus-s7t97\" (UID: \"066e3172-90cc-4dbf-9891-089727ab8561\") " pod="openshift-multus/multus-s7t97"
Apr 20 19:25:27.600229 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.600188 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" event={"ID":"7ddc67bd8f4881e0e6709f55e2cff023","Type":"ContainerStarted","Data":"25c63ad1560e1b75b4a9a3d08ccbb40e353823c7364d389ff6b273aa56624155"}
Apr 20 19:25:27.601113 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.601093 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-162.ec2.internal" event={"ID":"aca2508fa5a89bde3f166bc71272b03f","Type":"ContainerStarted","Data":"a20dccfe00f71aef961bcb14b5319f697577a5b3dce9564ed0edddd79087e9e0"}
Apr 20 19:25:27.782319 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.782217 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9v7nh"
Apr 20 19:25:27.788376 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:27.788351 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04c2a819_0254_4203_82e8_e61c4b88f509.slice/crio-6a376e512cd247c38b3aba61afa764491827065b6e429750cb974334e966c05d WatchSource:0}: Error finding container 6a376e512cd247c38b3aba61afa764491827065b6e429750cb974334e966c05d: Status 404 returned error can't find the container with id 6a376e512cd247c38b3aba61afa764491827065b6e429750cb974334e966c05d
Apr 20 19:25:27.798227 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.798209 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:27.804634 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:27.804611 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod988fcf46_c192_47b7_a3ad_27d4676cf1f2.slice/crio-e8f08ea3790702c7c5a28904e3a0002e209370dc8764d02933d267fdd649014d WatchSource:0}: Error finding container e8f08ea3790702c7c5a28904e3a0002e209370dc8764d02933d267fdd649014d: Status 404 returned error can't find the container with id e8f08ea3790702c7c5a28904e3a0002e209370dc8764d02933d267fdd649014d Apr 20 19:25:27.811817 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.811797 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" Apr 20 19:25:27.819627 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:27.819592 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8350cbc8_4d36_4f4a_af5b_45977dd7b9e6.slice/crio-a2f3a83bab1a22b5bebe8d7174342443d3108e13837b54721aa505936d371f09 WatchSource:0}: Error finding container a2f3a83bab1a22b5bebe8d7174342443d3108e13837b54721aa505936d371f09: Status 404 returned error can't find the container with id a2f3a83bab1a22b5bebe8d7174342443d3108e13837b54721aa505936d371f09 Apr 20 19:25:27.834746 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.834721 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b4sls" Apr 20 19:25:27.840308 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.840286 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vvbgf" Apr 20 19:25:27.840711 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:27.840688 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0a25aae_e259_4ad8_b476_5694a4f39d1d.slice/crio-0cc1dfb24f112e57b0696618d0f3c5ff1efb4ba0b56d2f76eefdbab238f8ae37 WatchSource:0}: Error finding container 0cc1dfb24f112e57b0696618d0f3c5ff1efb4ba0b56d2f76eefdbab238f8ae37: Status 404 returned error can't find the container with id 0cc1dfb24f112e57b0696618d0f3c5ff1efb4ba0b56d2f76eefdbab238f8ae37 Apr 20 19:25:27.845706 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.845678 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s7t97" Apr 20 19:25:27.846394 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:27.846376 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a5e765_c988_4980_8534_55e55f1296d7.slice/crio-e93514b2edb50f27a97c10c27bf1de2d0ad3543db37f420f28b81fc9054f60f7 WatchSource:0}: Error finding container e93514b2edb50f27a97c10c27bf1de2d0ad3543db37f420f28b81fc9054f60f7: Status 404 returned error can't find the container with id e93514b2edb50f27a97c10c27bf1de2d0ad3543db37f420f28b81fc9054f60f7 Apr 20 19:25:27.851281 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.851266 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-q97q5" Apr 20 19:25:27.852826 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:27.852805 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod066e3172_90cc_4dbf_9891_089727ab8561.slice/crio-9ff3125c669015136f805ccd2fc62b83729a520fd14c29976948078ffc52ca67 WatchSource:0}: Error finding container 9ff3125c669015136f805ccd2fc62b83729a520fd14c29976948078ffc52ca67: Status 404 returned error can't find the container with id 9ff3125c669015136f805ccd2fc62b83729a520fd14c29976948078ffc52ca67 Apr 20 19:25:27.856860 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.856840 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sjscp" Apr 20 19:25:27.857531 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:27.857387 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca9e64aa_e049_4e11_b4ec_79ec745fa7c6.slice/crio-9bdfe2a5e8b08f8ec6ab8a3c74c511094cf6cdf320540f23a0830ac74698334a WatchSource:0}: Error finding container 9bdfe2a5e8b08f8ec6ab8a3c74c511094cf6cdf320540f23a0830ac74698334a: Status 404 returned error can't find the container with id 9bdfe2a5e8b08f8ec6ab8a3c74c511094cf6cdf320540f23a0830ac74698334a Apr 20 19:25:27.861064 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:27.861034 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" Apr 20 19:25:27.863905 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:27.863877 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f0d6e92_255c_4940_acda_67d406e4eeee.slice/crio-aeb05565925bcb78a6313ffe82fb012117475287ffcbb860370dfa5045d72e2a WatchSource:0}: Error finding container aeb05565925bcb78a6313ffe82fb012117475287ffcbb860370dfa5045d72e2a: Status 404 returned error can't find the container with id aeb05565925bcb78a6313ffe82fb012117475287ffcbb860370dfa5045d72e2a Apr 20 19:25:27.871997 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:25:27.871970 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f1d0d9b_42cd_49b6_9d9f_41487c76d136.slice/crio-a5e8372e107f90ee9bd0a9ecf9ea06afb4a8f36d614102666e9214f5193c87a7 WatchSource:0}: Error finding container a5e8372e107f90ee9bd0a9ecf9ea06afb4a8f36d614102666e9214f5193c87a7: Status 404 returned error can't find the container with id a5e8372e107f90ee9bd0a9ecf9ea06afb4a8f36d614102666e9214f5193c87a7 Apr 20 19:25:28.077787 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.077751 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:28.078007 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:28.077977 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:28.078128 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:28.078057 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs podName:513dd790-7dbf-46da-821a-3493b9941466 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:29.078036804 +0000 UTC m=+3.068524338 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs") pod "network-metrics-daemon-7cd7d" (UID: "513dd790-7dbf-46da-821a-3493b9941466") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:28.179698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.179077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gpq\" (UniqueName: \"kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq\") pod \"network-check-target-clfqc\" (UID: \"5c4225dd-c1ca-427e-8883-7929ed2c386e\") " pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:28.179698 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:28.179257 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:25:28.179698 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:28.179275 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:25:28.179698 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:28.179286 2572 projected.go:194] Error preparing data for projected volume kube-api-access-z8gpq for pod openshift-network-diagnostics/network-check-target-clfqc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:28.179698 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:28.179340 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq podName:5c4225dd-c1ca-427e-8883-7929ed2c386e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:29.179322561 +0000 UTC m=+3.169810077 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-z8gpq" (UniqueName: "kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq") pod "network-check-target-clfqc" (UID: "5c4225dd-c1ca-427e-8883-7929ed2c386e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:28.199435 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.199166 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:25:28.504030 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.503932 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:20:27 +0000 UTC" deadline="2027-12-19 09:00:13.421556753 +0000 UTC" Apr 20 19:25:28.504030 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.503977 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14581h34m44.917584843s" Apr 20 19:25:28.610122 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.610092 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:28.610296 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:28.610227 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:28.618902 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.618861 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q97q5" event={"ID":"ca9e64aa-e049-4e11-b4ec-79ec745fa7c6","Type":"ContainerStarted","Data":"9bdfe2a5e8b08f8ec6ab8a3c74c511094cf6cdf320540f23a0830ac74698334a"} Apr 20 19:25:28.623897 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.623824 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vvbgf" event={"ID":"83a5e765-c988-4980-8534-55e55f1296d7","Type":"ContainerStarted","Data":"e93514b2edb50f27a97c10c27bf1de2d0ad3543db37f420f28b81fc9054f60f7"} Apr 20 19:25:28.653282 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.653176 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9v7nh" event={"ID":"04c2a819-0254-4203-82e8-e61c4b88f509","Type":"ContainerStarted","Data":"6a376e512cd247c38b3aba61afa764491827065b6e429750cb974334e966c05d"} Apr 20 19:25:28.667796 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.667752 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s7t97" event={"ID":"066e3172-90cc-4dbf-9891-089727ab8561","Type":"ContainerStarted","Data":"9ff3125c669015136f805ccd2fc62b83729a520fd14c29976948078ffc52ca67"} Apr 20 19:25:28.676372 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.676325 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b4sls" event={"ID":"b0a25aae-e259-4ad8-b476-5694a4f39d1d","Type":"ContainerStarted","Data":"0cc1dfb24f112e57b0696618d0f3c5ff1efb4ba0b56d2f76eefdbab238f8ae37"} Apr 20 19:25:28.689786 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.689744 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" event={"ID":"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6","Type":"ContainerStarted","Data":"a2f3a83bab1a22b5bebe8d7174342443d3108e13837b54721aa505936d371f09"} Apr 20 19:25:28.696217 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.696181 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" event={"ID":"988fcf46-c192-47b7-a3ad-27d4676cf1f2","Type":"ContainerStarted","Data":"e8f08ea3790702c7c5a28904e3a0002e209370dc8764d02933d267fdd649014d"} Apr 20 19:25:28.707544 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.707512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" event={"ID":"4f1d0d9b-42cd-49b6-9d9f-41487c76d136","Type":"ContainerStarted","Data":"a5e8372e107f90ee9bd0a9ecf9ea06afb4a8f36d614102666e9214f5193c87a7"} Apr 20 19:25:28.715716 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.715632 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sjscp" 
event={"ID":"6f0d6e92-255c-4940-acda-67d406e4eeee","Type":"ContainerStarted","Data":"aeb05565925bcb78a6313ffe82fb012117475287ffcbb860370dfa5045d72e2a"} Apr 20 19:25:28.951645 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.951609 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:25:28.952431 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:28.952406 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:25:29.087904 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:29.087863 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:29.088073 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:29.088016 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:29.088132 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:29.088080 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs podName:513dd790-7dbf-46da-821a-3493b9941466 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:31.088061437 +0000 UTC m=+5.078548957 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs") pod "network-metrics-daemon-7cd7d" (UID: "513dd790-7dbf-46da-821a-3493b9941466") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:29.188992 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:29.188951 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gpq\" (UniqueName: \"kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq\") pod \"network-check-target-clfqc\" (UID: \"5c4225dd-c1ca-427e-8883-7929ed2c386e\") " pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:29.189158 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:29.189119 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:25:29.189158 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:29.189138 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:25:29.189158 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:29.189150 2572 projected.go:194] Error preparing data for projected volume kube-api-access-z8gpq for pod openshift-network-diagnostics/network-check-target-clfqc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:29.189326 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:29.189209 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq podName:5c4225dd-c1ca-427e-8883-7929ed2c386e nodeName:}" failed. 
No retries permitted until 2026-04-20 19:25:31.189188624 +0000 UTC m=+5.179676141 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-z8gpq" (UniqueName: "kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq") pod "network-check-target-clfqc" (UID: "5c4225dd-c1ca-427e-8883-7929ed2c386e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:29.504196 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:29.504150 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:20:27 +0000 UTC" deadline="2028-01-17 13:48:27.385408432 +0000 UTC" Apr 20 19:25:29.504196 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:29.504193 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15282h22m57.881218732s" Apr 20 19:25:29.598651 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:29.598616 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:29.598832 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:29.598741 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:30.598212 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:30.598180 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:30.598827 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:30.598335 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:31.104658 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:31.104617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:31.104857 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:31.104806 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:31.104935 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:31.104888 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs podName:513dd790-7dbf-46da-821a-3493b9941466 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:35.104867971 +0000 UTC m=+9.095355498 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs") pod "network-metrics-daemon-7cd7d" (UID: "513dd790-7dbf-46da-821a-3493b9941466") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:31.205853 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:31.205812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gpq\" (UniqueName: \"kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq\") pod \"network-check-target-clfqc\" (UID: \"5c4225dd-c1ca-427e-8883-7929ed2c386e\") " pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:31.206071 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:31.206028 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:25:31.206071 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:31.206049 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:25:31.206071 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:31.206062 2572 projected.go:194] Error preparing data for projected volume kube-api-access-z8gpq for pod openshift-network-diagnostics/network-check-target-clfqc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:31.206228 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:31.206119 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq podName:5c4225dd-c1ca-427e-8883-7929ed2c386e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:35.206102041 +0000 UTC m=+9.196589571 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-z8gpq" (UniqueName: "kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq") pod "network-check-target-clfqc" (UID: "5c4225dd-c1ca-427e-8883-7929ed2c386e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:31.598045 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:31.597966 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:31.598226 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:31.598103 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:32.601559 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:32.601526 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:32.602065 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:32.601663 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:33.598830 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:33.598733 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:33.599017 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:33.598859 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:34.598407 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:34.598370 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:34.598831 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:34.598511 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:35.139175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:35.139102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:35.139351 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:35.139247 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:35.139351 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:35.139325 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs podName:513dd790-7dbf-46da-821a-3493b9941466 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:43.139303103 +0000 UTC m=+17.129790657 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs") pod "network-metrics-daemon-7cd7d" (UID: "513dd790-7dbf-46da-821a-3493b9941466") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:35.239679 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:35.239636 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gpq\" (UniqueName: \"kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq\") pod \"network-check-target-clfqc\" (UID: \"5c4225dd-c1ca-427e-8883-7929ed2c386e\") " pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:35.239882 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:35.239801 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:25:35.239882 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:35.239820 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:25:35.239882 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:35.239832 2572 projected.go:194] Error preparing data for projected volume kube-api-access-z8gpq for pod openshift-network-diagnostics/network-check-target-clfqc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:35.240044 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:35.239899 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq podName:5c4225dd-c1ca-427e-8883-7929ed2c386e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:43.239880507 +0000 UTC m=+17.230368027 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-z8gpq" (UniqueName: "kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq") pod "network-check-target-clfqc" (UID: "5c4225dd-c1ca-427e-8883-7929ed2c386e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:35.598798 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:35.598555 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:35.598798 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:35.598699 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:36.599088 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:36.599052 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:36.599541 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:36.599201 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:37.598514 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:37.598471 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:37.598681 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:37.598611 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:38.598643 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:38.598611 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:38.599106 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:38.598745 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:39.598252 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:39.598218 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:39.598414 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:39.598322 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:40.598371 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:40.598334 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:40.598826 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:40.598491 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:41.598727 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:41.598694 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:41.599144 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:41.598821 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:42.598306 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:42.598266 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:42.598513 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:42.598485 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:43.200136 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:43.200098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:43.200602 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:43.200256 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:43.200602 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:43.200319 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs podName:513dd790-7dbf-46da-821a-3493b9941466 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:59.20030325 +0000 UTC m=+33.190790768 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs") pod "network-metrics-daemon-7cd7d" (UID: "513dd790-7dbf-46da-821a-3493b9941466") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:43.301320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:43.301280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gpq\" (UniqueName: \"kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq\") pod \"network-check-target-clfqc\" (UID: \"5c4225dd-c1ca-427e-8883-7929ed2c386e\") " pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:43.301486 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:43.301469 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:25:43.301541 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:43.301491 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:25:43.301541 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:43.301504 2572 projected.go:194] Error preparing data for projected volume kube-api-access-z8gpq for pod openshift-network-diagnostics/network-check-target-clfqc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:43.301645 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:43.301580 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq podName:5c4225dd-c1ca-427e-8883-7929ed2c386e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:59.301546362 +0000 UTC m=+33.292033895 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-z8gpq" (UniqueName: "kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq") pod "network-check-target-clfqc" (UID: "5c4225dd-c1ca-427e-8883-7929ed2c386e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:43.598767 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:43.598735 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:43.598955 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:43.598867 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:44.598771 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:44.598735 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:44.599203 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:44.598916 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:45.360163 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:45.360126 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fhj74"] Apr 20 19:25:45.401552 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:45.401517 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:45.401741 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:45.401620 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fhj74" podUID="90320faf-0727-4631-bdba-64de071c97ba" Apr 20 19:25:45.517170 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:45.517132 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:45.517170 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:45.517175 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/90320faf-0727-4631-bdba-64de071c97ba-dbus\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:45.517390 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:45.517205 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/90320faf-0727-4631-bdba-64de071c97ba-kubelet-config\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:45.598620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:45.598584 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:45.598769 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:45.598711 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:45.618311 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:45.618228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:45.618311 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:45.618267 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/90320faf-0727-4631-bdba-64de071c97ba-dbus\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:45.618789 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:45.618369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/90320faf-0727-4631-bdba-64de071c97ba-dbus\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:45.618789 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:45.618385 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/90320faf-0727-4631-bdba-64de071c97ba-kubelet-config\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:45.618789 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:45.618433 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/90320faf-0727-4631-bdba-64de071c97ba-kubelet-config\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:45.618789 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:45.618394 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:25:45.618789 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:45.618522 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret podName:90320faf-0727-4631-bdba-64de071c97ba nodeName:}" failed. No retries permitted until 2026-04-20 19:25:46.118504663 +0000 UTC m=+20.108992175 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret") pod "global-pull-secret-syncer-fhj74" (UID: "90320faf-0727-4631-bdba-64de071c97ba") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:25:46.121539 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.121354 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:46.121539 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:46.121526 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:25:46.121768 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:46.121604 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret podName:90320faf-0727-4631-bdba-64de071c97ba nodeName:}" failed. No retries permitted until 2026-04-20 19:25:47.121584816 +0000 UTC m=+21.112072347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret") pod "global-pull-secret-syncer-fhj74" (UID: "90320faf-0727-4631-bdba-64de071c97ba") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:25:46.599321 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.599290 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:46.599621 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:46.599567 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:46.760729 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.760681 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-162.ec2.internal" event={"ID":"aca2508fa5a89bde3f166bc71272b03f","Type":"ContainerStarted","Data":"829496dbd72b07fa2b21baed01eb940c351436ada38057fda1a9ea952ae6ab7a"} Apr 20 19:25:46.762373 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.762345 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s7t97" event={"ID":"066e3172-90cc-4dbf-9891-089727ab8561","Type":"ContainerStarted","Data":"f194833a9e4a1295259def30c0e844d065b45d9a62ff3c6106b8652ecd7b30c0"} Apr 20 19:25:46.764275 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.764252 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b4sls" event={"ID":"b0a25aae-e259-4ad8-b476-5694a4f39d1d","Type":"ContainerStarted","Data":"dc9966a41205f189e2088c82604aecaa10a537fbf7b1051bf686102fdc716ff9"} Apr 20 19:25:46.765702 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.765664 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" event={"ID":"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6","Type":"ContainerStarted","Data":"6d11188156454d900ae12d67a902d55003e84eadf065f564258503829ba315ef"} Apr 20 19:25:46.768915 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.768555 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" event={"ID":"988fcf46-c192-47b7-a3ad-27d4676cf1f2","Type":"ContainerStarted","Data":"3d74b5b96d7508d872753e6a77cfa5d9c9438695fbd1062b3e2d9f8de50fbf2b"} Apr 20 19:25:46.768915 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.768895 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" event={"ID":"988fcf46-c192-47b7-a3ad-27d4676cf1f2","Type":"ContainerStarted","Data":"cfc3137c035a90969ab39484fbe1289208f3f7ac84c779444b07af40282e67a9"} Apr 20 19:25:46.768915 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.768910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" event={"ID":"988fcf46-c192-47b7-a3ad-27d4676cf1f2","Type":"ContainerStarted","Data":"fe8e52a7012c9da33158ead6e2be5fdd464b08cc293dbbd4ed5e77d1036b9f17"} Apr 20 19:25:46.769104 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.768942 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" event={"ID":"988fcf46-c192-47b7-a3ad-27d4676cf1f2","Type":"ContainerStarted","Data":"c08012fcbf9c628bf402dc9d6673a8b04c4f07795ba24079d3a973167f5f7566"} Apr 20 19:25:46.769104 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.768960 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" event={"ID":"988fcf46-c192-47b7-a3ad-27d4676cf1f2","Type":"ContainerStarted","Data":"8b7e8c4bd344ebdd9b58d4cfa0cc9dfa98d39eab4d555b067eee12aad72a1386"} Apr 20 19:25:46.769104 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.768973 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" event={"ID":"988fcf46-c192-47b7-a3ad-27d4676cf1f2","Type":"ContainerStarted","Data":"74ba2e0c013a35327dfedbc11bb421678dd10050f7600242191a94775cc106a3"} Apr 20 19:25:46.771747 ip-10-0-131-162 kubenswrapper[2572]: 
I0420 19:25:46.771727 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" event={"ID":"4f1d0d9b-42cd-49b6-9d9f-41487c76d136","Type":"ContainerStarted","Data":"e4e71cdabec21b827d8ffb2dc042bb0e9b2980fb76bcb08f0d5f18578b71c187"} Apr 20 19:25:46.773467 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.773195 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sjscp" event={"ID":"6f0d6e92-255c-4940-acda-67d406e4eeee","Type":"ContainerStarted","Data":"6ab797654f765f0f2452d3cbd1dce4a1f849b9103a89051f3e144b2a2748eeca"} Apr 20 19:25:46.774985 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.774956 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q97q5" event={"ID":"ca9e64aa-e049-4e11-b4ec-79ec745fa7c6","Type":"ContainerStarted","Data":"c28eeb8d414f892689e50a353247ad438d1de6cfe4ea387604d1774ac2c49d5d"} Apr 20 19:25:46.776263 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.776234 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vvbgf" event={"ID":"83a5e765-c988-4980-8534-55e55f1296d7","Type":"ContainerStarted","Data":"8e889c7a8b13fb5dd70e07792bba5022eaaec57572441e7df50f857f3f9134af"} Apr 20 19:25:46.828141 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.828098 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-162.ec2.internal" podStartSLOduration=20.828084119 podStartE2EDuration="20.828084119s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:25:46.80058579 +0000 UTC m=+20.791073322" watchObservedRunningTime="2026-04-20 19:25:46.828084119 +0000 UTC m=+20.818571712" Apr 20 19:25:46.828268 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.828231 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-b4sls" podStartSLOduration=2.939959048 podStartE2EDuration="20.828225413s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="2026-04-20 19:25:27.843225644 +0000 UTC m=+1.833713157" lastFinishedPulling="2026-04-20 19:25:45.731491998 +0000 UTC m=+19.721979522" observedRunningTime="2026-04-20 19:25:46.828219853 +0000 UTC m=+20.818707369" watchObservedRunningTime="2026-04-20 19:25:46.828225413 +0000 UTC m=+20.818712949" Apr 20 19:25:46.915909 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.915863 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vvbgf" podStartSLOduration=3.059953869 podStartE2EDuration="20.915848196s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="2026-04-20 19:25:27.849915106 +0000 UTC m=+1.840402620" lastFinishedPulling="2026-04-20 19:25:45.705809434 +0000 UTC m=+19.696296947" observedRunningTime="2026-04-20 19:25:46.86900738 +0000 UTC m=+20.859494915" watchObservedRunningTime="2026-04-20 19:25:46.915848196 +0000 UTC m=+20.906336047" Apr 20 19:25:46.949504 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.949434 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s7t97" podStartSLOduration=3.058771274 podStartE2EDuration="20.949419961s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="2026-04-20 19:25:27.854287255 +0000 UTC 
m=+1.844774769" lastFinishedPulling="2026-04-20 19:25:45.744935928 +0000 UTC m=+19.735423456" observedRunningTime="2026-04-20 19:25:46.915969901 +0000 UTC m=+20.906457417" watchObservedRunningTime="2026-04-20 19:25:46.949419961 +0000 UTC m=+20.939907495"
Apr 20 19:25:46.980601 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:46.980552 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-q97q5" podStartSLOduration=3.134224121 podStartE2EDuration="20.98053744s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="2026-04-20 19:25:27.859701906 +0000 UTC m=+1.850189420" lastFinishedPulling="2026-04-20 19:25:45.706015225 +0000 UTC m=+19.696502739" observedRunningTime="2026-04-20 19:25:46.949018451 +0000 UTC m=+20.939505985" watchObservedRunningTime="2026-04-20 19:25:46.98053744 +0000 UTC m=+20.971024956"
Apr 20 19:25:47.029046 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:47.028996 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-sjscp" podStartSLOduration=3.169615247 podStartE2EDuration="21.028980675s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="2026-04-20 19:25:27.866919436 +0000 UTC m=+1.857406967" lastFinishedPulling="2026-04-20 19:25:45.726284879 +0000 UTC m=+19.716772395" observedRunningTime="2026-04-20 19:25:46.981099026 +0000 UTC m=+20.971586561" watchObservedRunningTime="2026-04-20 19:25:47.028980675 +0000 UTC m=+21.019468209"
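
The pod_startup_latency_tracker entries above are self-consistent: podStartSLOduration is podStartE2EDuration minus the time spent pulling images, and the monotonic offsets (the m=+... values) reproduce the logged figure exactly. A quick check in Go against the node-resolver-b4sls entry a few lines up, with all constants copied from that entry:

    package main

    import "fmt"

    func main() {
        // Monotonic offsets (m=+..., in seconds) from the node-resolver-b4sls entry.
        firstStartedPulling := 1.833713157
        lastFinishedPulling := 19.721979522
        e2e := 20.828225413 // podStartE2EDuration

        // The SLO duration excludes image-pull time.
        slo := e2e - (lastFinishedPulling - firstStartedPulling)
        fmt.Printf("podStartSLOduration=%.9f\n", slo) // prints 2.939959048, matching the log
    }
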
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:47.598688 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:47.598625 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:47.598792 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:47.598716 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fhj74" podUID="90320faf-0727-4631-bdba-64de071c97ba" Apr 20 19:25:47.615868 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:47.615843 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 19:25:47.779917 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:47.779827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" event={"ID":"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6","Type":"ContainerStarted","Data":"53ae5751159cf69323ce4ed0677a589743de17dbe27a4e5863b99542e4dd556e"} Apr 20 19:25:47.781144 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:47.781119 2572 generic.go:358] "Generic (PLEG): container finished" podID="7ddc67bd8f4881e0e6709f55e2cff023" containerID="3cb22397ad23251ddc5b531cece5a409520f112c47a6029cc5329496621e5b0a" exitCode=0 Apr 20 19:25:47.781241 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:47.781183 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" event={"ID":"7ddc67bd8f4881e0e6709f55e2cff023","Type":"ContainerDied","Data":"3cb22397ad23251ddc5b531cece5a409520f112c47a6029cc5329496621e5b0a"} Apr 20 19:25:47.782486 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:47.782463 2572 generic.go:358] "Generic (PLEG): container finished" podID="4f1d0d9b-42cd-49b6-9d9f-41487c76d136" containerID="e4e71cdabec21b827d8ffb2dc042bb0e9b2980fb76bcb08f0d5f18578b71c187" exitCode=0 Apr 20 19:25:47.782568 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:47.782538 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" event={"ID":"4f1d0d9b-42cd-49b6-9d9f-41487c76d136","Type":"ContainerDied","Data":"e4e71cdabec21b827d8ffb2dc042bb0e9b2980fb76bcb08f0d5f18578b71c187"} Apr 20 19:25:47.783935 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:47.783811 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9v7nh" event={"ID":"04c2a819-0254-4203-82e8-e61c4b88f509","Type":"ContainerStarted","Data":"9ad0f628364aaf5ae2815c0acf7db996f41761317d8b82bd922bee206da41e05"} Apr 20 19:25:47.853296 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:47.853235 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9v7nh" podStartSLOduration=3.937589054 podStartE2EDuration="21.853215263s" 
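
These two "Error syncing pod" entries, and every repeat of them below, are the same condition: the container runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ does not yet contain a CNI configuration file, so the kubelet will not start sandboxes for pods that need pod networking. The condition being retried amounts to the following sketch (illustrative only, not the runtime's actual code; libcni-style config files end in .conf, .conflist, or .json):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // cniConfigPresent reports whether the CNI config directory contains at
    // least one network configuration file - the condition the repeated
    // NetworkPluginNotReady errors in this log are waiting on.
    func cniConfigPresent(dir string) bool {
        entries, err := os.ReadDir(dir)
        if err != nil {
            return false // a missing or unreadable directory counts as not ready
        }
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                return true
            }
        }
        return false
    }

    func main() {
        fmt.Println(cniConfigPresent("/etc/kubernetes/cni/net.d"))
    }

In this log the three affected pods (network-check-target-clfqc, global-pull-secret-syncer-fhj74, network-metrics-daemon-7cd7d) stay in this retry loop until around 19:26:00, presumably once the OVN-Kubernetes node pod has written its CNI configuration; host-network pods are unaffected, which is why the other daemonsets keep starting throughout.
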
podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="2026-04-20 19:25:27.789869648 +0000 UTC m=+1.780357161" lastFinishedPulling="2026-04-20 19:25:45.705495842 +0000 UTC m=+19.695983370" observedRunningTime="2026-04-20 19:25:47.853176457 +0000 UTC m=+21.843663992" watchObservedRunningTime="2026-04-20 19:25:47.853215263 +0000 UTC m=+21.843702799" Apr 20 19:25:48.533500 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:48.533395 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T19:25:47.615865465Z","UUID":"5b45c247-e8fb-4031-943d-5cada41fe60a","Handler":null,"Name":"","Endpoint":""} Apr 20 19:25:48.536214 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:48.536186 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 19:25:48.536214 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:48.536217 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 19:25:48.598487 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:48.598459 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:48.598624 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:48.598603 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
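
The plugin_watcher, RegisterPlugin, and csi_plugin entries above are one complete CSI driver registration, driven by the filesystem: the plugin watcher sees a new registration socket appear under /var/lib/kubelet/plugins_registry/, the reconciler starts RegisterPlugin, and the CSI layer validates the driver's name, endpoint, and supported version before registering it. The discovery step is plain inotify-style watching; a rough sketch of that pattern (using the fsnotify package for illustration, not the kubelet's actual implementation, which also performs a gRPC handshake over the socket to learn the driver's name and endpoint):

    package main

    import (
        "log"
        "strings"

        "github.com/fsnotify/fsnotify"
    )

    func main() {
        w, err := fsnotify.NewWatcher()
        if err != nil {
            log.Fatal(err)
        }
        defer w.Close()

        // The registration directory named in the plugin_watcher entry above.
        if err := w.Add("/var/lib/kubelet/plugins_registry"); err != nil {
            log.Fatal(err)
        }
        for ev := range w.Events {
            // A plugin announces itself by creating a *.sock file; the kubelet
            // would then dial the socket and ask the plugin to identify itself.
            if ev.Op&fsnotify.Create == fsnotify.Create && strings.HasSuffix(ev.Name, ".sock") {
                log.Printf("new plugin registration socket: %s", ev.Name)
            }
        }
    }
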
pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:48.787633 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:48.787593 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" event={"ID":"8350cbc8-4d36-4f4a-af5b-45977dd7b9e6","Type":"ContainerStarted","Data":"1d3dc0510214547e578a19864adcbeebd2ef01f6b6b22ab8c64c586f32b1bb8e"} Apr 20 19:25:48.790284 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:48.790260 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" event={"ID":"988fcf46-c192-47b7-a3ad-27d4676cf1f2","Type":"ContainerStarted","Data":"bd82a0f124b807351c6cc1ecef1ca9974bdf26c9fa1ecb2fbd3668a418732f13"} Apr 20 19:25:48.791782 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:48.791756 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" event={"ID":"7ddc67bd8f4881e0e6709f55e2cff023","Type":"ContainerStarted","Data":"358e693f573d21adf0ae8c0490f6478e7de0f6e0aafcec1baa6238286d78fd72"} Apr 20 19:25:48.809937 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:48.809892 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6hlg" podStartSLOduration=2.246169083 podStartE2EDuration="22.809878608s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="2026-04-20 19:25:27.821420688 +0000 UTC m=+1.811908202" lastFinishedPulling="2026-04-20 19:25:48.385130209 +0000 UTC m=+22.375617727" observedRunningTime="2026-04-20 19:25:48.80934382 +0000 UTC m=+22.799831376" watchObservedRunningTime="2026-04-20 19:25:48.809878608 +0000 UTC m=+22.800366143" Apr 20 19:25:48.825528 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:48.825485 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-162.ec2.internal" podStartSLOduration=22.825475634 podStartE2EDuration="22.825475634s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:25:48.824983261 +0000 UTC m=+22.815470807" watchObservedRunningTime="2026-04-20 19:25:48.825475634 +0000 UTC m=+22.815963169" Apr 20 19:25:49.143101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:49.143066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:49.143275 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:49.143178 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:25:49.143275 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:49.143237 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret podName:90320faf-0727-4631-bdba-64de071c97ba nodeName:}" failed. No retries permitted until 2026-04-20 19:25:53.143222583 +0000 UTC m=+27.133710096 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret") pod "global-pull-secret-syncer-fhj74" (UID: "90320faf-0727-4631-bdba-64de071c97ba") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:25:49.598474 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:49.598423 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:49.598652 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:49.598423 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:49.598652 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:49.598548 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fhj74" podUID="90320faf-0727-4631-bdba-64de071c97ba" Apr 20 19:25:49.598652 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:49.598607 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:50.598304 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:50.598263 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:50.598966 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:50.598376 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:50.798884 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:50.798617 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" event={"ID":"988fcf46-c192-47b7-a3ad-27d4676cf1f2","Type":"ContainerStarted","Data":"b8856214fcdd5b72392b6329dbf46575ce1dbeabc4271b4371262f8f39756789"} Apr 20 19:25:50.799062 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:50.798930 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:50.799062 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:50.798968 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:50.799062 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:50.798980 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:50.815769 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:50.815537 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:50.815769 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:50.815605 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" Apr 20 19:25:50.841121 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:50.841047 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj" podStartSLOduration=6.614115745 podStartE2EDuration="24.841028799s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="2026-04-20 19:25:27.805991446 +0000 UTC m=+1.796478959" lastFinishedPulling="2026-04-20 19:25:46.032904497 +0000 UTC m=+20.023392013" observedRunningTime="2026-04-20 19:25:50.83916866 +0000 UTC m=+24.829656221" watchObservedRunningTime="2026-04-20 19:25:50.841028799 +0000 UTC m=+24.831516335" Apr 20 19:25:51.325359 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:51.325312 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q97q5" Apr 20 19:25:51.326007 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:51.325986 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q97q5" Apr 20 19:25:51.598320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:51.598240 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:51.598735 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:51.598240 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:51.598735 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:51.598364 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:51.598735 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:51.598432 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fhj74" podUID="90320faf-0727-4631-bdba-64de071c97ba" Apr 20 19:25:51.800804 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:51.800780 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q97q5" Apr 20 19:25:52.597838 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:52.597808 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:52.597984 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:52.597954 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:52.932639 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:52.932566 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-clfqc"] Apr 20 19:25:52.933071 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:52.932690 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:52.933071 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:52.932772 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:52.935843 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:52.935805 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7cd7d"] Apr 20 19:25:52.935970 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:52.935921 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:52.936064 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:52.936040 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:52.936428 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:52.936404 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fhj74"] Apr 20 19:25:52.936552 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:52.936517 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:52.936627 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:52.936601 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fhj74" podUID="90320faf-0727-4631-bdba-64de071c97ba" Apr 20 19:25:53.173920 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:53.173869 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:53.174076 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:53.174008 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:25:53.174137 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:53.174087 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret podName:90320faf-0727-4631-bdba-64de071c97ba nodeName:}" failed. No retries permitted until 2026-04-20 19:26:01.174066588 +0000 UTC m=+35.164554118 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret") pod "global-pull-secret-syncer-fhj74" (UID: "90320faf-0727-4631-bdba-64de071c97ba") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:25:54.598709 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:54.598674 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:54.599152 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:54.598674 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:54.599152 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:54.598802 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:54.599152 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:54.598810 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:54.599152 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:54.598926 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:54.599152 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:54.599021 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fhj74" podUID="90320faf-0727-4631-bdba-64de071c97ba" Apr 20 19:25:56.599053 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:56.598846 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:56.599701 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:56.598926 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:56.599701 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:56.599112 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:56.599701 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:56.598946 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:56.599701 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:56.599238 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:56.599701 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:56.599299 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fhj74" podUID="90320faf-0727-4631-bdba-64de071c97ba" Apr 20 19:25:56.810858 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:56.810820 2572 generic.go:358] "Generic (PLEG): container finished" podID="4f1d0d9b-42cd-49b6-9d9f-41487c76d136" containerID="724ea85e36b029640d1352536c3fbb1971a5c8e4a456f663aa9affce5dd6ca04" exitCode=0 Apr 20 19:25:56.811029 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:56.810863 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" event={"ID":"4f1d0d9b-42cd-49b6-9d9f-41487c76d136","Type":"ContainerDied","Data":"724ea85e36b029640d1352536c3fbb1971a5c8e4a456f663aa9affce5dd6ca04"} Apr 20 19:25:57.814890 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:57.814858 2572 generic.go:358] "Generic (PLEG): container finished" podID="4f1d0d9b-42cd-49b6-9d9f-41487c76d136" containerID="ef1e312707ce3a003f3337edec3b00bb1d2235b8072a06ff22fb9779ad777fd8" exitCode=0 Apr 20 19:25:57.815354 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:57.814918 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" event={"ID":"4f1d0d9b-42cd-49b6-9d9f-41487c76d136","Type":"ContainerDied","Data":"ef1e312707ce3a003f3337edec3b00bb1d2235b8072a06ff22fb9779ad777fd8"} Apr 20 19:25:58.598213 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.598180 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:58.598213 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.598209 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhj74" Apr 20 19:25:58.598429 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.598298 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:58.598429 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:58.598316 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-clfqc" podUID="5c4225dd-c1ca-427e-8883-7929ed2c386e" Apr 20 19:25:58.598429 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:58.598407 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:25:58.598584 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:58.598509 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fhj74" podUID="90320faf-0727-4631-bdba-64de071c97ba" Apr 20 19:25:58.785882 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.785803 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-162.ec2.internal" event="NodeReady" Apr 20 19:25:58.786036 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.785936 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 19:25:58.819085 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.819050 2572 generic.go:358] "Generic (PLEG): container finished" podID="4f1d0d9b-42cd-49b6-9d9f-41487c76d136" containerID="c8bf37aac65c2d112d251b24e194f2ff9ae9816b93e628fcaef8452835ad2ab8" exitCode=0 Apr 20 19:25:58.819661 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.819089 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" event={"ID":"4f1d0d9b-42cd-49b6-9d9f-41487c76d136","Type":"ContainerDied","Data":"c8bf37aac65c2d112d251b24e194f2ff9ae9816b93e628fcaef8452835ad2ab8"} Apr 20 19:25:58.836911 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.836885 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kxblw"] Apr 20 19:25:58.839698 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.839679 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-92xgv"] Apr 20 19:25:58.839824 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.839809 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kxblw" Apr 20 19:25:58.842476 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.842456 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-92xgv" Apr 20 19:25:58.842709 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.842691 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbf4z\"" Apr 20 19:25:58.842779 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.842705 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 19:25:58.842963 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.842947 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 19:25:58.843098 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.843083 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 19:25:58.844651 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.844634 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 19:25:58.844922 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.844908 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5vnpt\"" Apr 20 19:25:58.845045 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.845032 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 19:25:58.855697 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.855674 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kxblw"] Apr 20 19:25:58.856563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.856543 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-92xgv"] Apr 20 19:25:58.916601 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.916564 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-tmp-dir\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:25:58.916793 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.916637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg54r\" (UniqueName: \"kubernetes.io/projected/bc24b476-7aaf-4c95-b13e-44550d15e793-kube-api-access-rg54r\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw" Apr 20 19:25:58.916793 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.916717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw" Apr 20 19:25:58.916793 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.916754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-config-volume\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 
20 19:25:58.916793 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.916783 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw9kb\" (UniqueName: \"kubernetes.io/projected/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-kube-api-access-rw9kb\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:25:58.917010 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:58.916874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:25:59.018097 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.018057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw" Apr 20 19:25:59.018097 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.018095 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-config-volume\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:25:59.018336 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.018114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw9kb\" (UniqueName: \"kubernetes.io/projected/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-kube-api-access-rw9kb\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:25:59.018336 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.018219 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:25:59.018336 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.018282 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert podName:bc24b476-7aaf-4c95-b13e-44550d15e793 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:59.518262773 +0000 UTC m=+33.508750301 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert") pod "ingress-canary-kxblw" (UID: "bc24b476-7aaf-4c95-b13e-44550d15e793") : secret "canary-serving-cert" not found Apr 20 19:25:59.018532 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.018371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:25:59.018532 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.018462 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:25:59.018532 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.018487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-tmp-dir\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:25:59.018532 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.018508 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls podName:b9a6ffc3-ed3d-4922-acb0-cf3513a1d431 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:59.518495749 +0000 UTC m=+33.508983261 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls") pod "dns-default-92xgv" (UID: "b9a6ffc3-ed3d-4922-acb0-cf3513a1d431") : secret "dns-default-metrics-tls" not found Apr 20 19:25:59.018663 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.018542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rg54r\" (UniqueName: \"kubernetes.io/projected/bc24b476-7aaf-4c95-b13e-44550d15e793-kube-api-access-rg54r\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw" Apr 20 19:25:59.018702 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.018687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-tmp-dir\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:25:59.018771 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.018755 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-config-volume\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:25:59.030802 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.030779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw9kb\" (UniqueName: \"kubernetes.io/projected/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-kube-api-access-rw9kb\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:25:59.031059 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.031039 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rg54r\" (UniqueName: \"kubernetes.io/projected/bc24b476-7aaf-4c95-b13e-44550d15e793-kube-api-access-rg54r\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw" Apr 20 19:25:59.220142 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.220104 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:25:59.220285 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.220224 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:59.220285 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.220280 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs podName:513dd790-7dbf-46da-821a-3493b9941466 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:31.220267102 +0000 UTC m=+65.210754615 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs") pod "network-metrics-daemon-7cd7d" (UID: "513dd790-7dbf-46da-821a-3493b9941466") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:59.320885 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.320847 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gpq\" (UniqueName: \"kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq\") pod \"network-check-target-clfqc\" (UID: \"5c4225dd-c1ca-427e-8883-7929ed2c386e\") " pod="openshift-network-diagnostics/network-check-target-clfqc" Apr 20 19:25:59.321042 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.321021 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:25:59.321109 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.321045 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:25:59.321109 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.321055 2572 projected.go:194] Error preparing data for projected volume kube-api-access-z8gpq for pod openshift-network-diagnostics/network-check-target-clfqc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:59.321190 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.321114 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq podName:5c4225dd-c1ca-427e-8883-7929ed2c386e nodeName:}" failed. No retries permitted until 2026-04-20 19:26:31.321098907 +0000 UTC m=+65.311586424 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z8gpq" (UniqueName: "kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq") pod "network-check-target-clfqc" (UID: "5c4225dd-c1ca-427e-8883-7929ed2c386e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:59.522541 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.522456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw" Apr 20 19:25:59.522541 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:25:59.522539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:25:59.522754 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.522606 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:25:59.522754 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.522663 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:25:59.522754 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.522676 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert podName:bc24b476-7aaf-4c95-b13e-44550d15e793 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:00.522660603 +0000 UTC m=+34.513148121 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert") pod "ingress-canary-kxblw" (UID: "bc24b476-7aaf-4c95-b13e-44550d15e793") : secret "canary-serving-cert" not found Apr 20 19:25:59.522754 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:25:59.522706 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls podName:b9a6ffc3-ed3d-4922-acb0-cf3513a1d431 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:00.52269315 +0000 UTC m=+34.513180662 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls") pod "dns-default-92xgv" (UID: "b9a6ffc3-ed3d-4922-acb0-cf3513a1d431") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:00.533144 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:00.533104 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw"
Apr 20 19:26:00.533611 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:00.533180 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv"
Apr 20 19:26:00.533611 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:00.533280 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:26:00.533611 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:00.533304 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:26:00.533611 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:00.533360 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert podName:bc24b476-7aaf-4c95-b13e-44550d15e793 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:02.533339525 +0000 UTC m=+36.523827042 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert") pod "ingress-canary-kxblw" (UID: "bc24b476-7aaf-4c95-b13e-44550d15e793") : secret "canary-serving-cert" not found
Apr 20 19:26:00.533611 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:00.533384 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls podName:b9a6ffc3-ed3d-4922-acb0-cf3513a1d431 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:02.533374973 +0000 UTC m=+36.523862486 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls") pod "dns-default-92xgv" (UID: "b9a6ffc3-ed3d-4922-acb0-cf3513a1d431") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:00.598612 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:00.598398 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc"
Apr 20 19:26:00.598612 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:00.598449 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhj74"
Apr 20 19:26:00.598612 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:00.598487 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d"
Apr 20 19:26:00.601421 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:00.601393 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 19:26:00.601665 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:00.601645 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-89kqx\""
Apr 20 19:26:00.601804 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:00.601645 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 19:26:00.602613 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:00.602523 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 19:26:00.602715 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:00.602684 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 19:26:00.602715 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:00.602710 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v8fpn\""
Apr 20 19:26:01.240549 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:01.240516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74"
Apr 20 19:26:01.243121 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:01.243095 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90320faf-0727-4631-bdba-64de071c97ba-original-pull-secret\") pod \"global-pull-secret-syncer-fhj74\" (UID: \"90320faf-0727-4631-bdba-64de071c97ba\") " pod="kube-system/global-pull-secret-syncer-fhj74"
Apr 20 19:26:01.511304 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:01.511224 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fhj74"
Apr 20 19:26:01.674185 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:01.673855 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fhj74"]
Apr 20 19:26:01.677993 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:26:01.677961 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90320faf_0727_4631_bdba_64de071c97ba.slice/crio-4a99f1b211907db6b4cbded06e61815898ddb9c6095c16b03d3c8e7b41a96821 WatchSource:0}: Error finding container 4a99f1b211907db6b4cbded06e61815898ddb9c6095c16b03d3c8e7b41a96821: Status 404 returned error can't find the container with id 4a99f1b211907db6b4cbded06e61815898ddb9c6095c16b03d3c8e7b41a96821
Apr 20 19:26:01.827035 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:01.826994 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fhj74" event={"ID":"90320faf-0727-4631-bdba-64de071c97ba","Type":"ContainerStarted","Data":"4a99f1b211907db6b4cbded06e61815898ddb9c6095c16b03d3c8e7b41a96821"}
Apr 20 19:26:02.550664 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:02.550556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw"
Apr 20 19:26:02.550664 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:02.550645 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv"
Apr 20 19:26:02.550942 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:02.550743 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:26:02.550942 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:02.550781 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:26:02.550942 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:02.550824 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert podName:bc24b476-7aaf-4c95-b13e-44550d15e793 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:06.550802927 +0000 UTC m=+40.541290445 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert") pod "ingress-canary-kxblw" (UID: "bc24b476-7aaf-4c95-b13e-44550d15e793") : secret "canary-serving-cert" not found
Apr 20 19:26:02.550942 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:02.550868 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls podName:b9a6ffc3-ed3d-4922-acb0-cf3513a1d431 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:06.550850197 +0000 UTC m=+40.541337723 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls") pod "dns-default-92xgv" (UID: "b9a6ffc3-ed3d-4922-acb0-cf3513a1d431") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:06.589166 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:06.589067 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw"
Apr 20 19:26:06.589633 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:06.589289 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:26:06.589633 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:06.589400 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert podName:bc24b476-7aaf-4c95-b13e-44550d15e793 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:14.589374972 +0000 UTC m=+48.579862485 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert") pod "ingress-canary-kxblw" (UID: "bc24b476-7aaf-4c95-b13e-44550d15e793") : secret "canary-serving-cert" not found
Apr 20 19:26:06.590096 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:06.589806 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv"
Apr 20 19:26:06.590096 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:06.589974 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:26:06.590096 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:06.590070 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls podName:b9a6ffc3-ed3d-4922-acb0-cf3513a1d431 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:14.590045265 +0000 UTC m=+48.580532796 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls") pod "dns-default-92xgv" (UID: "b9a6ffc3-ed3d-4922-acb0-cf3513a1d431") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:06.838842 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:06.838807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" event={"ID":"4f1d0d9b-42cd-49b6-9d9f-41487c76d136","Type":"ContainerStarted","Data":"75f48523ef40fda13dd54ef7be9f622c8b714c26617c3c33e87df8bfd6d08bc0"}
Apr 20 19:26:06.840652 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:06.840629 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fhj74" event={"ID":"90320faf-0727-4631-bdba-64de071c97ba","Type":"ContainerStarted","Data":"1eb273e2133e43fa01d0317e9a820cfb6e3a9afe710dc925b7b198bb3a75896e"}
Apr 20 19:26:06.907239 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:06.907163 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fhj74" podStartSLOduration=17.076333466 podStartE2EDuration="21.907145368s" podCreationTimestamp="2026-04-20 19:25:45 +0000 UTC" firstStartedPulling="2026-04-20 19:26:01.680057818 +0000 UTC m=+35.670545336" lastFinishedPulling="2026-04-20 19:26:06.510869724 +0000 UTC m=+40.501357238" observedRunningTime="2026-04-20 19:26:06.906853785 +0000 UTC m=+40.897341332" watchObservedRunningTime="2026-04-20 19:26:06.907145368 +0000 UTC m=+40.897632903"
Apr 20 19:26:07.845057 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:07.845020 2572 generic.go:358] "Generic (PLEG): container finished" podID="4f1d0d9b-42cd-49b6-9d9f-41487c76d136" containerID="75f48523ef40fda13dd54ef7be9f622c8b714c26617c3c33e87df8bfd6d08bc0" exitCode=0
Apr 20 19:26:07.845057 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:07.845048 2572 generic.go:358] "Generic (PLEG): container finished" podID="4f1d0d9b-42cd-49b6-9d9f-41487c76d136" containerID="a29bc67603526843796b0902d9319f4f5bf33e48689e1cb89805fd1e71bcac2e" exitCode=0
Apr 20 19:26:07.845568 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:07.845095 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" event={"ID":"4f1d0d9b-42cd-49b6-9d9f-41487c76d136","Type":"ContainerDied","Data":"75f48523ef40fda13dd54ef7be9f622c8b714c26617c3c33e87df8bfd6d08bc0"}
Apr 20 19:26:07.845568 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:07.845127 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" event={"ID":"4f1d0d9b-42cd-49b6-9d9f-41487c76d136","Type":"ContainerDied","Data":"a29bc67603526843796b0902d9319f4f5bf33e48689e1cb89805fd1e71bcac2e"}
Apr 20 19:26:08.849598 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:08.849565 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" event={"ID":"4f1d0d9b-42cd-49b6-9d9f-41487c76d136","Type":"ContainerStarted","Data":"6ec036c2d1ab045e476616fab0d752f24c681641329d50b9eab6389b87d088a1"}
Apr 20 19:26:08.888403 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:08.888342 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2z5nt" podStartSLOduration=4.259657539 podStartE2EDuration="42.888324824s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="2026-04-20 19:25:27.873833238 +0000 UTC m=+1.864320756" lastFinishedPulling="2026-04-20 19:26:06.502500529 +0000 UTC m=+40.492988041" observedRunningTime="2026-04-20 19:26:08.888101355 +0000 UTC m=+42.878588891" watchObservedRunningTime="2026-04-20 19:26:08.888324824 +0000 UTC m=+42.878812362"
Apr 20 19:26:14.649296 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:14.649260 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw"
Apr 20 19:26:14.649815 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:14.649322 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv"
Apr 20 19:26:14.649815 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:14.649406 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:26:14.649815 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:14.649426 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:26:14.649815 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:14.649492 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert podName:bc24b476-7aaf-4c95-b13e-44550d15e793 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:30.649474275 +0000 UTC m=+64.639961796 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert") pod "ingress-canary-kxblw" (UID: "bc24b476-7aaf-4c95-b13e-44550d15e793") : secret "canary-serving-cert" not found
Apr 20 19:26:14.649815 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:14.649506 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls podName:b9a6ffc3-ed3d-4922-acb0-cf3513a1d431 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:30.649500556 +0000 UTC m=+64.639988069 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls") pod "dns-default-92xgv" (UID: "b9a6ffc3-ed3d-4922-acb0-cf3513a1d431") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:22.813512 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:22.813477 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5ksvj"
Apr 20 19:26:30.656455 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:30.656259 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw"
Apr 20 19:26:30.657029 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:30.656706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv"
Apr 20 19:26:30.657029 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:30.656764 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:26:30.657029 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:30.656850 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert podName:bc24b476-7aaf-4c95-b13e-44550d15e793 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:02.65681974 +0000 UTC m=+96.647307267 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert") pod "ingress-canary-kxblw" (UID: "bc24b476-7aaf-4c95-b13e-44550d15e793") : secret "canary-serving-cert" not found
Apr 20 19:26:30.657029 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:30.656861 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:26:30.657029 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:30.656925 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls podName:b9a6ffc3-ed3d-4922-acb0-cf3513a1d431 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:02.656906675 +0000 UTC m=+96.647394191 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls") pod "dns-default-92xgv" (UID: "b9a6ffc3-ed3d-4922-acb0-cf3513a1d431") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:31.261133 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:31.261097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d"
Apr 20 19:26:31.263961 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:31.263943 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 19:26:31.272101 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:31.272083 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 19:26:31.272162 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:26:31.272138 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs podName:513dd790-7dbf-46da-821a-3493b9941466 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:35.272121864 +0000 UTC m=+129.262609378 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs") pod "network-metrics-daemon-7cd7d" (UID: "513dd790-7dbf-46da-821a-3493b9941466") : secret "metrics-daemon-secret" not found
Apr 20 19:26:31.361542 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:31.361508 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gpq\" (UniqueName: \"kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq\") pod \"network-check-target-clfqc\" (UID: \"5c4225dd-c1ca-427e-8883-7929ed2c386e\") " pod="openshift-network-diagnostics/network-check-target-clfqc"
Apr 20 19:26:31.364221 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:31.364198 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 19:26:31.374561 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:31.374539 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 19:26:31.385993 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:31.385970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8gpq\" (UniqueName: \"kubernetes.io/projected/5c4225dd-c1ca-427e-8883-7929ed2c386e-kube-api-access-z8gpq\") pod \"network-check-target-clfqc\" (UID: \"5c4225dd-c1ca-427e-8883-7929ed2c386e\") " pod="openshift-network-diagnostics/network-check-target-clfqc"
Apr 20 19:26:31.527677 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:31.527602 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-89kqx\""
Apr 20 19:26:31.535208 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:31.535188 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-clfqc"
Apr 20 19:26:31.667124 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:31.667091 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-clfqc"]
Apr 20 19:26:31.671058 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:26:31.671027 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c4225dd_c1ca_427e_8883_7929ed2c386e.slice/crio-9e52a0d7a28d2a42931dfbd5613d23fa306e84142f542ce6816293121ac76d1e WatchSource:0}: Error finding container 9e52a0d7a28d2a42931dfbd5613d23fa306e84142f542ce6816293121ac76d1e: Status 404 returned error can't find the container with id 9e52a0d7a28d2a42931dfbd5613d23fa306e84142f542ce6816293121ac76d1e
Apr 20 19:26:31.894062 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:31.894030 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-clfqc" event={"ID":"5c4225dd-c1ca-427e-8883-7929ed2c386e","Type":"ContainerStarted","Data":"9e52a0d7a28d2a42931dfbd5613d23fa306e84142f542ce6816293121ac76d1e"}
Apr 20 19:26:34.901573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:34.901542 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-clfqc" event={"ID":"5c4225dd-c1ca-427e-8883-7929ed2c386e","Type":"ContainerStarted","Data":"1eaa483a902a00094e98d6102274f46d432934227f0868b050e7192e0d6e5cb3"}
Apr 20 19:26:34.901998 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:34.901679 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-clfqc"
Apr 20 19:26:34.919976 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:26:34.919925 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-clfqc" podStartSLOduration=66.235663318 podStartE2EDuration="1m8.919911425s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="2026-04-20 19:26:31.67297284 +0000 UTC m=+65.663460356" lastFinishedPulling="2026-04-20 19:26:34.35722095 +0000 UTC m=+68.347708463" observedRunningTime="2026-04-20 19:26:34.9187682 +0000 UTC m=+68.909255735" watchObservedRunningTime="2026-04-20 19:26:34.919911425 +0000 UTC m=+68.910398957"
Apr 20 19:27:02.668855 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:02.668816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw"
Apr 20 19:27:02.669265 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:02.668883 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv"
Apr 20 19:27:02.669265 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:02.668957 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:27:02.669265 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:02.668983 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:27:02.669265 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:02.669022 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert podName:bc24b476-7aaf-4c95-b13e-44550d15e793 nodeName:}" failed. No retries permitted until 2026-04-20 19:28:06.669005555 +0000 UTC m=+160.659493069 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert") pod "ingress-canary-kxblw" (UID: "bc24b476-7aaf-4c95-b13e-44550d15e793") : secret "canary-serving-cert" not found
Apr 20 19:27:02.669265 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:02.669036 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls podName:b9a6ffc3-ed3d-4922-acb0-cf3513a1d431 nodeName:}" failed. No retries permitted until 2026-04-20 19:28:06.669030035 +0000 UTC m=+160.659517548 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls") pod "dns-default-92xgv" (UID: "b9a6ffc3-ed3d-4922-acb0-cf3513a1d431") : secret "dns-default-metrics-tls" not found
Apr 20 19:27:05.906458 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:05.906412 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-clfqc"
Apr 20 19:27:34.292924 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.292889 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"]
Apr 20 19:27:34.294835 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.294820 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:34.298401 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.298381 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 19:27:34.299479 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.299458 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-vgnlm\""
Apr 20 19:27:34.299567 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.299485 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 19:27:34.301012 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.300992 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 20 19:27:34.301096 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.300994 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 20 19:27:34.307278 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.307257 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"]
Apr 20 19:27:34.380475 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.380419 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4c29\" (UniqueName: \"kubernetes.io/projected/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-kube-api-access-j4c29\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:34.380646 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.380585 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:34.380646 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.380617 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:34.481851 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.481820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:34.481851 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.481856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:34.482048 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.481881 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4c29\" (UniqueName: \"kubernetes.io/projected/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-kube-api-access-j4c29\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:34.482048 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:34.481994 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 19:27:34.482130 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:34.482073 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls podName:b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:34.982056777 +0000 UTC m=+128.972544289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8796n" (UID: "b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0") : secret "cluster-monitoring-operator-tls" not found
Apr 20 19:27:34.482507 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.482490 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:34.490602 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.490580 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4c29\" (UniqueName: \"kubernetes.io/projected/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-kube-api-access-j4c29\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:34.985865 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:34.985822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:34.986042 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:34.985985 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 19:27:34.986084 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:34.986050 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls podName:b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:35.98603433 +0000 UTC m=+129.976521847 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8796n" (UID: "b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0") : secret "cluster-monitoring-operator-tls" not found
Apr 20 19:27:35.288937 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:35.288835 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d"
Apr 20 19:27:35.289092 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:35.288957 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 19:27:35.289092 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:35.289012 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs podName:513dd790-7dbf-46da-821a-3493b9941466 nodeName:}" failed. No retries permitted until 2026-04-20 19:29:37.288996699 +0000 UTC m=+251.279484212 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs") pod "network-metrics-daemon-7cd7d" (UID: "513dd790-7dbf-46da-821a-3493b9941466") : secret "metrics-daemon-secret" not found
Apr 20 19:27:35.995556 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:35.995518 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:35.995940 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:35.995666 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 19:27:35.995940 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:35.995737 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls podName:b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:37.995716637 +0000 UTC m=+131.986204153 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8796n" (UID: "b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0") : secret "cluster-monitoring-operator-tls" not found
Apr 20 19:27:38.010412 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:38.010359 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:38.010823 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:38.010500 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 19:27:38.010823 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:38.010564 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls podName:b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:42.010548523 +0000 UTC m=+136.001036036 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8796n" (UID: "b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0") : secret "cluster-monitoring-operator-tls" not found
Apr 20 19:27:39.461959 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:39.461932 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b4sls_b0a25aae-e259-4ad8-b476-5694a4f39d1d/dns-node-resolver/0.log"
Apr 20 19:27:40.459589 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:40.459561 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vvbgf_83a5e765-c988-4980-8534-55e55f1296d7/node-ca/0.log"
Apr 20 19:27:42.041055 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:42.041006 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"
Apr 20 19:27:42.041560 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:42.041145 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 19:27:42.041560 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:42.041223 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls podName:b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:50.041198573 +0000 UTC m=+144.031686089 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8796n" (UID: "b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0") : secret "cluster-monitoring-operator-tls" not found
Apr 20 19:27:44.153114 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.153079 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cf4z7"]
Apr 20 19:27:44.155019 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.155001 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cf4z7"
Apr 20 19:27:44.157557 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.157531 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-zdsj8\""
Apr 20 19:27:44.157658 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.157593 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 20 19:27:44.157658 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.157598 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:27:44.165686 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.165660 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cf4z7"]
Apr 20 19:27:44.256409 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.256376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk98j\" (UniqueName: \"kubernetes.io/projected/c42ddb6b-2227-4768-8867-b1506419b88d-kube-api-access-bk98j\") pod \"volume-data-source-validator-7c6cbb6c87-cf4z7\" (UID: \"c42ddb6b-2227-4768-8867-b1506419b88d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cf4z7"
Apr 20 19:27:44.262711 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.262681 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"]
Apr 20 19:27:44.264545 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.264529 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"
Apr 20 19:27:44.268000 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.267964 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 20 19:27:44.268152 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.267999 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 20 19:27:44.268152 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.268019 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:27:44.268152 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.268032 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-fkqm4\""
Apr 20 19:27:44.268332 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.268318 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 20 19:27:44.270374 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.270353 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"]
Apr 20 19:27:44.272379 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.272361 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-qpl5s"]
Apr 20 19:27:44.272533 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.272518 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"
Apr 20 19:27:44.274140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.274124 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.275390 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.275362 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 20 19:27:44.275390 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.275375 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-zzfkv\""
Apr 20 19:27:44.275565 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.275395 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:27:44.275565 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.275369 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 20 19:27:44.275565 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.275454 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 20 19:27:44.276602 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.276589 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:27:44.276656 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.276632 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 20 19:27:44.276852 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.276838 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 20 19:27:44.277107 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.277088 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 20 19:27:44.277107 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.277100 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-jqwz2\""
Apr 20 19:27:44.280308 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.280282 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"]
Apr 20 19:27:44.282689 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.282669 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 20 19:27:44.287111 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.287085 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"]
Apr 20 19:27:44.294734 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.294704 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-qpl5s"]
Apr 20 19:27:44.357584 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.357542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bk98j\" (UniqueName: \"kubernetes.io/projected/c42ddb6b-2227-4768-8867-b1506419b88d-kube-api-access-bk98j\") pod \"volume-data-source-validator-7c6cbb6c87-cf4z7\" (UID: \"c42ddb6b-2227-4768-8867-b1506419b88d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cf4z7"
Apr 20 19:27:44.357757 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.357609 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7a85080-ad4d-4e33-b890-2483a1f5c762-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-b74jx\" (UID: \"e7a85080-ad4d-4e33-b890-2483a1f5c762\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"
Apr 20 19:27:44.357757 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.357634 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7a85080-ad4d-4e33-b890-2483a1f5c762-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-b74jx\" (UID: \"e7a85080-ad4d-4e33-b890-2483a1f5c762\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"
Apr 20 19:27:44.357757 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.357650 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7dn2\" (UniqueName: \"kubernetes.io/projected/e7a85080-ad4d-4e33-b890-2483a1f5c762-kube-api-access-m7dn2\") pod \"kube-storage-version-migrator-operator-6769c5d45-b74jx\" (UID: \"e7a85080-ad4d-4e33-b890-2483a1f5c762\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"
Apr 20 19:27:44.357757 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.357676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a25d18d6-5add-4c28-a671-0ee5222cb999-trusted-ca\") pod \"console-operator-9d4b6777b-qpl5s\" (UID: \"a25d18d6-5add-4c28-a671-0ee5222cb999\") " pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.357890 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.357769 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52f4554-13e1-451b-851d-003e1e091adc-config\") pod \"service-ca-operator-d6fc45fc5-qf28c\" (UID: \"e52f4554-13e1-451b-851d-003e1e091adc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"
Apr 20 19:27:44.357890 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.357794 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rnx4\" (UniqueName: \"kubernetes.io/projected/e52f4554-13e1-451b-851d-003e1e091adc-kube-api-access-8rnx4\") pod \"service-ca-operator-d6fc45fc5-qf28c\" (UID: \"e52f4554-13e1-451b-851d-003e1e091adc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"
Apr 20 19:27:44.357890 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.357814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25d18d6-5add-4c28-a671-0ee5222cb999-config\") pod \"console-operator-9d4b6777b-qpl5s\" (UID: \"a25d18d6-5add-4c28-a671-0ee5222cb999\") " pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.357890 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.357831 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25d18d6-5add-4c28-a671-0ee5222cb999-serving-cert\") pod \"console-operator-9d4b6777b-qpl5s\" (UID: \"a25d18d6-5add-4c28-a671-0ee5222cb999\") " pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.358011 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.357910 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e52f4554-13e1-451b-851d-003e1e091adc-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qf28c\" (UID: \"e52f4554-13e1-451b-851d-003e1e091adc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"
Apr 20 19:27:44.358011 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.357953 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nksdb\" (UniqueName: \"kubernetes.io/projected/a25d18d6-5add-4c28-a671-0ee5222cb999-kube-api-access-nksdb\") pod \"console-operator-9d4b6777b-qpl5s\" (UID: \"a25d18d6-5add-4c28-a671-0ee5222cb999\") " pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.365731 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.365701 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk98j\" (UniqueName: \"kubernetes.io/projected/c42ddb6b-2227-4768-8867-b1506419b88d-kube-api-access-bk98j\") pod \"volume-data-source-validator-7c6cbb6c87-cf4z7\" (UID: \"c42ddb6b-2227-4768-8867-b1506419b88d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cf4z7"
Apr 20 19:27:44.458569 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.458478 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7a85080-ad4d-4e33-b890-2483a1f5c762-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-b74jx\" (UID: \"e7a85080-ad4d-4e33-b890-2483a1f5c762\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"
Apr 20 19:27:44.458569 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.458520 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7a85080-ad4d-4e33-b890-2483a1f5c762-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-b74jx\" (UID: \"e7a85080-ad4d-4e33-b890-2483a1f5c762\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"
Apr 20 19:27:44.458569 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.458551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7dn2\" (UniqueName: \"kubernetes.io/projected/e7a85080-ad4d-4e33-b890-2483a1f5c762-kube-api-access-m7dn2\") pod \"kube-storage-version-migrator-operator-6769c5d45-b74jx\" (UID: \"e7a85080-ad4d-4e33-b890-2483a1f5c762\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"
Apr 20 19:27:44.458569 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.458570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a25d18d6-5add-4c28-a671-0ee5222cb999-trusted-ca\") pod \"console-operator-9d4b6777b-qpl5s\" (UID: \"a25d18d6-5add-4c28-a671-0ee5222cb999\") " pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.458882 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.458599 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52f4554-13e1-451b-851d-003e1e091adc-config\") pod \"service-ca-operator-d6fc45fc5-qf28c\" (UID: \"e52f4554-13e1-451b-851d-003e1e091adc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"
Apr 20 19:27:44.458882 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.458619 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rnx4\" (UniqueName: \"kubernetes.io/projected/e52f4554-13e1-451b-851d-003e1e091adc-kube-api-access-8rnx4\") pod \"service-ca-operator-d6fc45fc5-qf28c\" (UID: \"e52f4554-13e1-451b-851d-003e1e091adc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"
Apr 20 19:27:44.458882 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.458637 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25d18d6-5add-4c28-a671-0ee5222cb999-config\") pod \"console-operator-9d4b6777b-qpl5s\" (UID: \"a25d18d6-5add-4c28-a671-0ee5222cb999\") " pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.458882 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.458724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25d18d6-5add-4c28-a671-0ee5222cb999-serving-cert\") pod \"console-operator-9d4b6777b-qpl5s\" (UID: \"a25d18d6-5add-4c28-a671-0ee5222cb999\") " pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.458882 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.458783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e52f4554-13e1-451b-851d-003e1e091adc-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qf28c\" (UID: \"e52f4554-13e1-451b-851d-003e1e091adc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"
Apr 20 19:27:44.458882 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.458838 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nksdb\" (UniqueName: \"kubernetes.io/projected/a25d18d6-5add-4c28-a671-0ee5222cb999-kube-api-access-nksdb\") pod \"console-operator-9d4b6777b-qpl5s\" (UID: \"a25d18d6-5add-4c28-a671-0ee5222cb999\") " pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.459244 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.459217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7a85080-ad4d-4e33-b890-2483a1f5c762-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-b74jx\" (UID: \"e7a85080-ad4d-4e33-b890-2483a1f5c762\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"
Apr 20 19:27:44.459337 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.459242 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52f4554-13e1-451b-851d-003e1e091adc-config\") pod \"service-ca-operator-d6fc45fc5-qf28c\" (UID: \"e52f4554-13e1-451b-851d-003e1e091adc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"
Apr 20 19:27:44.459411 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.459392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25d18d6-5add-4c28-a671-0ee5222cb999-config\") pod \"console-operator-9d4b6777b-qpl5s\" (UID: \"a25d18d6-5add-4c28-a671-0ee5222cb999\") " pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.459765 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.459741 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a25d18d6-5add-4c28-a671-0ee5222cb999-trusted-ca\") pod \"console-operator-9d4b6777b-qpl5s\" (UID: \"a25d18d6-5add-4c28-a671-0ee5222cb999\") " pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.461038 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.461005 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7a85080-ad4d-4e33-b890-2483a1f5c762-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-b74jx\" (UID: \"e7a85080-ad4d-4e33-b890-2483a1f5c762\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"
Apr 20 19:27:44.461152 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.461137 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e52f4554-13e1-451b-851d-003e1e091adc-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qf28c\" (UID: \"e52f4554-13e1-451b-851d-003e1e091adc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"
Apr 20 19:27:44.461584 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.461568 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25d18d6-5add-4c28-a671-0ee5222cb999-serving-cert\") pod \"console-operator-9d4b6777b-qpl5s\" (UID: \"a25d18d6-5add-4c28-a671-0ee5222cb999\") " pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.463495 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.463478 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cf4z7"
Apr 20 19:27:44.466920 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.466901 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nksdb\" (UniqueName: \"kubernetes.io/projected/a25d18d6-5add-4c28-a671-0ee5222cb999-kube-api-access-nksdb\") pod \"console-operator-9d4b6777b-qpl5s\" (UID: \"a25d18d6-5add-4c28-a671-0ee5222cb999\") " pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.467045 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.466977 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rnx4\" (UniqueName: \"kubernetes.io/projected/e52f4554-13e1-451b-851d-003e1e091adc-kube-api-access-8rnx4\") pod \"service-ca-operator-d6fc45fc5-qf28c\" (UID: \"e52f4554-13e1-451b-851d-003e1e091adc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"
Apr 20 19:27:44.467370 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.467347 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7dn2\" (UniqueName: \"kubernetes.io/projected/e7a85080-ad4d-4e33-b890-2483a1f5c762-kube-api-access-m7dn2\") pod \"kube-storage-version-migrator-operator-6769c5d45-b74jx\" (UID: \"e7a85080-ad4d-4e33-b890-2483a1f5c762\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"
Apr 20 19:27:44.573920 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.573878 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"
Apr 20 19:27:44.579703 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.579673 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cf4z7"]
Apr 20 19:27:44.582411 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.582385 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"
Apr 20 19:27:44.583519 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:27:44.583485 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc42ddb6b_2227_4768_8867_b1506419b88d.slice/crio-1b9e11b39e7b04fc70c107320c1adad7cc01ac59effb79456943a829b3b0baae WatchSource:0}: Error finding container 1b9e11b39e7b04fc70c107320c1adad7cc01ac59effb79456943a829b3b0baae: Status 404 returned error can't find the container with id 1b9e11b39e7b04fc70c107320c1adad7cc01ac59effb79456943a829b3b0baae
Apr 20 19:27:44.588784 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.588761 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:44.717819 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.717744 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c"]
Apr 20 19:27:44.721343 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:27:44.721311 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode52f4554_13e1_451b_851d_003e1e091adc.slice/crio-bfd3cd4fa6d6a70d1795d1da0a6292ff564ef96dca200294bbeb0bf3cee24c53 WatchSource:0}: Error finding container bfd3cd4fa6d6a70d1795d1da0a6292ff564ef96dca200294bbeb0bf3cee24c53: Status 404 returned error can't find the container with id bfd3cd4fa6d6a70d1795d1da0a6292ff564ef96dca200294bbeb0bf3cee24c53
Apr 20 19:27:44.935410 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.935377 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx"]
Apr 20 19:27:44.938317 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:44.938286 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-qpl5s"]
Apr 20 19:27:44.938747 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:27:44.938710 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7a85080_ad4d_4e33_b890_2483a1f5c762.slice/crio-0feb5f977a69e0934575d19d387b1e992f220a605bcca6ed53955986a50f0fb3 WatchSource:0}: Error finding container 0feb5f977a69e0934575d19d387b1e992f220a605bcca6ed53955986a50f0fb3: Status 404 returned error can't find the container with id 0feb5f977a69e0934575d19d387b1e992f220a605bcca6ed53955986a50f0fb3
Apr 20 19:27:44.941121 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:27:44.941093 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda25d18d6_5add_4c28_a671_0ee5222cb999.slice/crio-b3edef7e6d5ed2bbeb9d1e7edeedf181d9a46015b7e3fc506982d153c4439ec3 WatchSource:0}: Error finding container b3edef7e6d5ed2bbeb9d1e7edeedf181d9a46015b7e3fc506982d153c4439ec3: Status 404 returned error can't find the container with id b3edef7e6d5ed2bbeb9d1e7edeedf181d9a46015b7e3fc506982d153c4439ec3
Apr 20 19:27:45.030796 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:45.030699 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx" event={"ID":"e7a85080-ad4d-4e33-b890-2483a1f5c762","Type":"ContainerStarted","Data":"0feb5f977a69e0934575d19d387b1e992f220a605bcca6ed53955986a50f0fb3"}
Apr 20 19:27:45.031735 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:45.031706 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c" event={"ID":"e52f4554-13e1-451b-851d-003e1e091adc","Type":"ContainerStarted","Data":"bfd3cd4fa6d6a70d1795d1da0a6292ff564ef96dca200294bbeb0bf3cee24c53"}
Apr 20 19:27:45.032596 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:45.032562 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s" event={"ID":"a25d18d6-5add-4c28-a671-0ee5222cb999","Type":"ContainerStarted","Data":"b3edef7e6d5ed2bbeb9d1e7edeedf181d9a46015b7e3fc506982d153c4439ec3"}
Apr 20
19:27:45.033486 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:45.033459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cf4z7" event={"ID":"c42ddb6b-2227-4768-8867-b1506419b88d","Type":"ContainerStarted","Data":"1b9e11b39e7b04fc70c107320c1adad7cc01ac59effb79456943a829b3b0baae"} Apr 20 19:27:48.042145 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:48.042048 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cf4z7" event={"ID":"c42ddb6b-2227-4768-8867-b1506419b88d","Type":"ContainerStarted","Data":"1144cc0f09e9847050149ddd00c63891755c4f7febbddfb78e1695a7bb1252d2"} Apr 20 19:27:48.043480 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:48.043428 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx" event={"ID":"e7a85080-ad4d-4e33-b890-2483a1f5c762","Type":"ContainerStarted","Data":"5588799756f23454cf6ba0a2ac6617468198747a824277fa75e33fb8cc234c77"} Apr 20 19:27:48.044613 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:48.044587 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c" event={"ID":"e52f4554-13e1-451b-851d-003e1e091adc","Type":"ContainerStarted","Data":"21213bde81c11727685bc4fffc6acb6eee908d52a8841d7d9105f5e7434e2c22"} Apr 20 19:27:48.046019 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:48.046000 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/0.log" Apr 20 19:27:48.046213 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:48.046034 2572 generic.go:358] "Generic (PLEG): container finished" podID="a25d18d6-5add-4c28-a671-0ee5222cb999" containerID="c18cbd151169b9d262ae1715cbc6e5ed002eef8648b2070ac8ca859abb4e2421" exitCode=255 Apr 20 19:27:48.046213 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:48.046057 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s" event={"ID":"a25d18d6-5add-4c28-a671-0ee5222cb999","Type":"ContainerDied","Data":"c18cbd151169b9d262ae1715cbc6e5ed002eef8648b2070ac8ca859abb4e2421"} Apr 20 19:27:48.046322 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:48.046297 2572 scope.go:117] "RemoveContainer" containerID="c18cbd151169b9d262ae1715cbc6e5ed002eef8648b2070ac8ca859abb4e2421" Apr 20 19:27:48.057077 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:48.057037 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cf4z7" podStartSLOduration=0.884396539 podStartE2EDuration="4.057023303s" podCreationTimestamp="2026-04-20 19:27:44 +0000 UTC" firstStartedPulling="2026-04-20 19:27:44.58733846 +0000 UTC m=+138.577825973" lastFinishedPulling="2026-04-20 19:27:47.759965204 +0000 UTC m=+141.750452737" observedRunningTime="2026-04-20 19:27:48.056647781 +0000 UTC m=+142.047135320" watchObservedRunningTime="2026-04-20 19:27:48.057023303 +0000 UTC m=+142.047510837" Apr 20 19:27:48.074080 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:48.073699 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx" 
podStartSLOduration=1.247266447 podStartE2EDuration="4.073680856s" podCreationTimestamp="2026-04-20 19:27:44 +0000 UTC" firstStartedPulling="2026-04-20 19:27:44.940691354 +0000 UTC m=+138.931178872" lastFinishedPulling="2026-04-20 19:27:47.76710575 +0000 UTC m=+141.757593281" observedRunningTime="2026-04-20 19:27:48.073394858 +0000 UTC m=+142.063882406" watchObservedRunningTime="2026-04-20 19:27:48.073680856 +0000 UTC m=+142.064168392" Apr 20 19:27:48.090420 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:48.090331 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c" podStartSLOduration=1.047991956 podStartE2EDuration="4.090310791s" podCreationTimestamp="2026-04-20 19:27:44 +0000 UTC" firstStartedPulling="2026-04-20 19:27:44.723219119 +0000 UTC m=+138.713706632" lastFinishedPulling="2026-04-20 19:27:47.765537937 +0000 UTC m=+141.756025467" observedRunningTime="2026-04-20 19:27:48.089473984 +0000 UTC m=+142.079961521" watchObservedRunningTime="2026-04-20 19:27:48.090310791 +0000 UTC m=+142.080798327" Apr 20 19:27:49.050853 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:49.050823 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/1.log" Apr 20 19:27:49.051319 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:49.051272 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/0.log" Apr 20 19:27:49.051403 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:49.051321 2572 generic.go:358] "Generic (PLEG): container finished" podID="a25d18d6-5add-4c28-a671-0ee5222cb999" containerID="839bd4e59ca85fffc5ae39602b88377407438178a3ff16af876be7cc696de0a8" exitCode=255 Apr 20 19:27:49.051473 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:49.051423 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s" event={"ID":"a25d18d6-5add-4c28-a671-0ee5222cb999","Type":"ContainerDied","Data":"839bd4e59ca85fffc5ae39602b88377407438178a3ff16af876be7cc696de0a8"} Apr 20 19:27:49.051532 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:49.051479 2572 scope.go:117] "RemoveContainer" containerID="c18cbd151169b9d262ae1715cbc6e5ed002eef8648b2070ac8ca859abb4e2421" Apr 20 19:27:49.051710 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:49.051690 2572 scope.go:117] "RemoveContainer" containerID="839bd4e59ca85fffc5ae39602b88377407438178a3ff16af876be7cc696de0a8" Apr 20 19:27:49.051940 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:49.051908 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-qpl5s_openshift-console-operator(a25d18d6-5add-4c28-a671-0ee5222cb999)\"" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s" podUID="a25d18d6-5add-4c28-a671-0ee5222cb999" Apr 20 19:27:50.055562 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.055533 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/1.log" Apr 20 19:27:50.055947 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.055865 2572 scope.go:117] "RemoveContainer" 
containerID="839bd4e59ca85fffc5ae39602b88377407438178a3ff16af876be7cc696de0a8" Apr 20 19:27:50.056037 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:50.056019 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-qpl5s_openshift-console-operator(a25d18d6-5add-4c28-a671-0ee5222cb999)\"" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s" podUID="a25d18d6-5add-4c28-a671-0ee5222cb999" Apr 20 19:27:50.107310 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.107265 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n" Apr 20 19:27:50.107513 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:50.107381 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 19:27:50.107513 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:50.107436 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls podName:b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0 nodeName:}" failed. No retries permitted until 2026-04-20 19:28:06.107422767 +0000 UTC m=+160.097910280 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8796n" (UID: "b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0") : secret "cluster-monitoring-operator-tls" not found Apr 20 19:27:50.640045 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.640011 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-snltq"] Apr 20 19:27:50.642586 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.642563 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-snltq" Apr 20 19:27:50.644982 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.644962 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-ppsnv\"" Apr 20 19:27:50.650030 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.650004 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-snltq"] Apr 20 19:27:50.813735 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.813697 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm7r4\" (UniqueName: \"kubernetes.io/projected/5cbc0b7a-ab5e-40fd-b9f5-accffe0107fc-kube-api-access-wm7r4\") pod \"network-check-source-8894fc9bd-snltq\" (UID: \"5cbc0b7a-ab5e-40fd-b9f5-accffe0107fc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-snltq" Apr 20 19:27:50.908908 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.908821 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gjghs"] Apr 20 19:27:50.910876 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.910861 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:50.914508 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.914481 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wm7r4\" (UniqueName: \"kubernetes.io/projected/5cbc0b7a-ab5e-40fd-b9f5-accffe0107fc-kube-api-access-wm7r4\") pod \"network-check-source-8894fc9bd-snltq\" (UID: \"5cbc0b7a-ab5e-40fd-b9f5-accffe0107fc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-snltq" Apr 20 19:27:50.916004 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.915976 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 19:27:50.916122 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.916030 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 19:27:50.916183 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.916127 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 19:27:50.916607 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.916591 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 19:27:50.918667 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.918646 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4kb52\"" Apr 20 19:27:50.932615 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.932581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm7r4\" (UniqueName: \"kubernetes.io/projected/5cbc0b7a-ab5e-40fd-b9f5-accffe0107fc-kube-api-access-wm7r4\") pod \"network-check-source-8894fc9bd-snltq\" (UID: \"5cbc0b7a-ab5e-40fd-b9f5-accffe0107fc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-snltq" Apr 20 19:27:50.938332 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.938304 2572 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-insights/insights-runtime-extractor-gjghs"] Apr 20 19:27:50.952182 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:50.952142 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-snltq" Apr 20 19:27:51.015250 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.015204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c49e842f-07fc-49d1-a61f-45722a72a1cf-crio-socket\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.015419 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.015270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c49e842f-07fc-49d1-a61f-45722a72a1cf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.015419 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.015311 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.015419 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.015337 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxnsm\" (UniqueName: \"kubernetes.io/projected/c49e842f-07fc-49d1-a61f-45722a72a1cf-kube-api-access-bxnsm\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.015419 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.015367 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c49e842f-07fc-49d1-a61f-45722a72a1cf-data-volume\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.078816 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.078776 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-snltq"] Apr 20 19:27:51.081726 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:27:51.081697 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cbc0b7a_ab5e_40fd_b9f5_accffe0107fc.slice/crio-d3dca6cb3011059872725d4d60fadc4ddca022af3be27b01f313f9f21a50d412 WatchSource:0}: Error finding container d3dca6cb3011059872725d4d60fadc4ddca022af3be27b01f313f9f21a50d412: Status 404 returned error can't find the container with id d3dca6cb3011059872725d4d60fadc4ddca022af3be27b01f313f9f21a50d412 Apr 20 19:27:51.115732 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.115705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/c49e842f-07fc-49d1-a61f-45722a72a1cf-crio-socket\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.115862 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.115746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c49e842f-07fc-49d1-a61f-45722a72a1cf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.115862 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.115769 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.115862 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.115795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxnsm\" (UniqueName: \"kubernetes.io/projected/c49e842f-07fc-49d1-a61f-45722a72a1cf-kube-api-access-bxnsm\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.115862 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.115823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c49e842f-07fc-49d1-a61f-45722a72a1cf-data-volume\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.115862 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.115832 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c49e842f-07fc-49d1-a61f-45722a72a1cf-crio-socket\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.116096 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:51.115956 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:27:51.116096 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:51.116031 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls podName:c49e842f-07fc-49d1-a61f-45722a72a1cf nodeName:}" failed. No retries permitted until 2026-04-20 19:27:51.616009358 +0000 UTC m=+145.606496890 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls") pod "insights-runtime-extractor-gjghs" (UID: "c49e842f-07fc-49d1-a61f-45722a72a1cf") : secret "insights-runtime-extractor-tls" not found Apr 20 19:27:51.116256 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.116240 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c49e842f-07fc-49d1-a61f-45722a72a1cf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.116655 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.116638 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c49e842f-07fc-49d1-a61f-45722a72a1cf-data-volume\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.124148 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.124133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxnsm\" (UniqueName: \"kubernetes.io/projected/c49e842f-07fc-49d1-a61f-45722a72a1cf-kube-api-access-bxnsm\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.619585 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.619548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:51.619744 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:51.619712 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:27:51.619798 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:51.619778 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls podName:c49e842f-07fc-49d1-a61f-45722a72a1cf nodeName:}" failed. No retries permitted until 2026-04-20 19:27:52.619760945 +0000 UTC m=+146.610248458 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls") pod "insights-runtime-extractor-gjghs" (UID: "c49e842f-07fc-49d1-a61f-45722a72a1cf") : secret "insights-runtime-extractor-tls" not found Apr 20 19:27:51.822405 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.822376 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sbzc8"] Apr 20 19:27:51.824503 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.824488 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-sbzc8" Apr 20 19:27:51.827063 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.827037 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 19:27:51.827181 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.827077 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 19:27:51.827181 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.827103 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 19:27:51.828207 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.828181 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 19:27:51.828318 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.828220 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-q9xtx\"" Apr 20 19:27:51.832590 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.832571 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sbzc8"] Apr 20 19:27:51.922485 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.922381 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/99cd77d4-5e9a-4aed-a526-4db3a1ad05c0-signing-key\") pod \"service-ca-865cb79987-sbzc8\" (UID: \"99cd77d4-5e9a-4aed-a526-4db3a1ad05c0\") " pod="openshift-service-ca/service-ca-865cb79987-sbzc8" Apr 20 19:27:51.922485 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.922428 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/99cd77d4-5e9a-4aed-a526-4db3a1ad05c0-signing-cabundle\") pod \"service-ca-865cb79987-sbzc8\" (UID: \"99cd77d4-5e9a-4aed-a526-4db3a1ad05c0\") " pod="openshift-service-ca/service-ca-865cb79987-sbzc8" Apr 20 19:27:51.922485 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:51.922474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smgm5\" (UniqueName: \"kubernetes.io/projected/99cd77d4-5e9a-4aed-a526-4db3a1ad05c0-kube-api-access-smgm5\") pod \"service-ca-865cb79987-sbzc8\" (UID: \"99cd77d4-5e9a-4aed-a526-4db3a1ad05c0\") " pod="openshift-service-ca/service-ca-865cb79987-sbzc8" Apr 20 19:27:52.023262 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:52.023217 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/99cd77d4-5e9a-4aed-a526-4db3a1ad05c0-signing-key\") pod \"service-ca-865cb79987-sbzc8\" (UID: \"99cd77d4-5e9a-4aed-a526-4db3a1ad05c0\") " pod="openshift-service-ca/service-ca-865cb79987-sbzc8" Apr 20 19:27:52.023481 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:52.023284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/99cd77d4-5e9a-4aed-a526-4db3a1ad05c0-signing-cabundle\") pod \"service-ca-865cb79987-sbzc8\" (UID: \"99cd77d4-5e9a-4aed-a526-4db3a1ad05c0\") " pod="openshift-service-ca/service-ca-865cb79987-sbzc8" Apr 20 19:27:52.023481 ip-10-0-131-162 kubenswrapper[2572]: I0420 
Apr 20 19:27:52.024043 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:52.024023 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/99cd77d4-5e9a-4aed-a526-4db3a1ad05c0-signing-cabundle\") pod \"service-ca-865cb79987-sbzc8\" (UID: \"99cd77d4-5e9a-4aed-a526-4db3a1ad05c0\") " pod="openshift-service-ca/service-ca-865cb79987-sbzc8"
Apr 20 19:27:52.025598 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:52.025582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/99cd77d4-5e9a-4aed-a526-4db3a1ad05c0-signing-key\") pod \"service-ca-865cb79987-sbzc8\" (UID: \"99cd77d4-5e9a-4aed-a526-4db3a1ad05c0\") " pod="openshift-service-ca/service-ca-865cb79987-sbzc8"
Apr 20 19:27:52.031162 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:52.031139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smgm5\" (UniqueName: \"kubernetes.io/projected/99cd77d4-5e9a-4aed-a526-4db3a1ad05c0-kube-api-access-smgm5\") pod \"service-ca-865cb79987-sbzc8\" (UID: \"99cd77d4-5e9a-4aed-a526-4db3a1ad05c0\") " pod="openshift-service-ca/service-ca-865cb79987-sbzc8"
Apr 20 19:27:52.063247 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:52.063205 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-snltq" event={"ID":"5cbc0b7a-ab5e-40fd-b9f5-accffe0107fc","Type":"ContainerStarted","Data":"3cd96b2e96a867e839634dbbd08e57e83c6d18d004eeeb87d18092c6e376769d"}
Apr 20 19:27:52.063247 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:52.063241 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-snltq" event={"ID":"5cbc0b7a-ab5e-40fd-b9f5-accffe0107fc","Type":"ContainerStarted","Data":"d3dca6cb3011059872725d4d60fadc4ddca022af3be27b01f313f9f21a50d412"}
Apr 20 19:27:52.078150 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:52.078097 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-snltq" podStartSLOduration=2.078080061 podStartE2EDuration="2.078080061s" podCreationTimestamp="2026-04-20 19:27:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:27:52.077200839 +0000 UTC m=+146.067688378" watchObservedRunningTime="2026-04-20 19:27:52.078080061 +0000 UTC m=+146.068567597"
Apr 20 19:27:52.134627 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:52.134593 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-sbzc8"
Apr 20 19:27:52.253288 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:52.253256 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sbzc8"]
Apr 20 19:27:52.256787 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:27:52.256758 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99cd77d4_5e9a_4aed_a526_4db3a1ad05c0.slice/crio-8017f83710707e10d3d0ee4315762e3f6897bb8d687f30c44316bac2780d3111 WatchSource:0}: Error finding container 8017f83710707e10d3d0ee4315762e3f6897bb8d687f30c44316bac2780d3111: Status 404 returned error can't find the container with id 8017f83710707e10d3d0ee4315762e3f6897bb8d687f30c44316bac2780d3111
Apr 20 19:27:52.627167 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:52.627123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs"
Apr 20 19:27:52.627378 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:52.627289 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:27:52.627378 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:52.627366 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls podName:c49e842f-07fc-49d1-a61f-45722a72a1cf nodeName:}" failed. No retries permitted until 2026-04-20 19:27:54.627346947 +0000 UTC m=+148.617834463 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls") pod "insights-runtime-extractor-gjghs" (UID: "c49e842f-07fc-49d1-a61f-45722a72a1cf") : secret "insights-runtime-extractor-tls" not found
Apr 20 19:27:53.071456 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:53.071400 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-sbzc8" event={"ID":"99cd77d4-5e9a-4aed-a526-4db3a1ad05c0","Type":"ContainerStarted","Data":"d47f574a42b074b36d8c41c694577d3808a18137ecaf73d8e2e402908c089341"}
Apr 20 19:27:53.071456 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:53.071460 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-sbzc8" event={"ID":"99cd77d4-5e9a-4aed-a526-4db3a1ad05c0","Type":"ContainerStarted","Data":"8017f83710707e10d3d0ee4315762e3f6897bb8d687f30c44316bac2780d3111"}
Apr 20 19:27:53.087725 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:53.087679 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-sbzc8" podStartSLOduration=2.087665602 podStartE2EDuration="2.087665602s" podCreationTimestamp="2026-04-20 19:27:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:27:53.087075868 +0000 UTC m=+147.077563404" watchObservedRunningTime="2026-04-20 19:27:53.087665602 +0000 UTC m=+147.078153144"
Apr 20 19:27:54.589552 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:54.589507 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:54.589552 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:54.589554 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s"
Apr 20 19:27:54.590078 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:54.590036 2572 scope.go:117] "RemoveContainer" containerID="839bd4e59ca85fffc5ae39602b88377407438178a3ff16af876be7cc696de0a8"
Apr 20 19:27:54.590270 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:54.590247 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-qpl5s_openshift-console-operator(a25d18d6-5add-4c28-a671-0ee5222cb999)\"" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s" podUID="a25d18d6-5add-4c28-a671-0ee5222cb999"
Apr 20 19:27:54.643231 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:54.643188 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs"
Apr 20 19:27:54.643409 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:54.643371 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:27:54.643510 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:54.643497 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls podName:c49e842f-07fc-49d1-a61f-45722a72a1cf nodeName:}" failed. No retries permitted until 2026-04-20 19:27:58.643474201 +0000 UTC m=+152.633961726 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls") pod "insights-runtime-extractor-gjghs" (UID: "c49e842f-07fc-49d1-a61f-45722a72a1cf") : secret "insights-runtime-extractor-tls" not found
for "{volumeName:kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls podName:c49e842f-07fc-49d1-a61f-45722a72a1cf nodeName:}" failed. No retries permitted until 2026-04-20 19:27:58.643474201 +0000 UTC m=+152.633961726 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls") pod "insights-runtime-extractor-gjghs" (UID: "c49e842f-07fc-49d1-a61f-45722a72a1cf") : secret "insights-runtime-extractor-tls" not found Apr 20 19:27:58.679982 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:27:58.679937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:27:58.680371 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:58.680076 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:27:58.680371 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:27:58.680137 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls podName:c49e842f-07fc-49d1-a61f-45722a72a1cf nodeName:}" failed. No retries permitted until 2026-04-20 19:28:06.68012049 +0000 UTC m=+160.670608009 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls") pod "insights-runtime-extractor-gjghs" (UID: "c49e842f-07fc-49d1-a61f-45722a72a1cf") : secret "insights-runtime-extractor-tls" not found Apr 20 19:28:01.850624 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:28:01.850572 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kxblw" podUID="bc24b476-7aaf-4c95-b13e-44550d15e793" Apr 20 19:28:01.855755 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:28:01.855703 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-92xgv" podUID="b9a6ffc3-ed3d-4922-acb0-cf3513a1d431" Apr 20 19:28:02.093355 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:02.093320 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-92xgv" Apr 20 19:28:02.093355 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:02.093348 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kxblw" Apr 20 19:28:03.618895 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:28:03.618849 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7cd7d" podUID="513dd790-7dbf-46da-821a-3493b9941466" Apr 20 19:28:06.137908 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.137850 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n" Apr 20 19:28:06.140384 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.140349 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8796n\" (UID: \"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n" Apr 20 19:28:06.403168 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.403077 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n" Apr 20 19:28:06.523214 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.523182 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n"] Apr 20 19:28:06.526727 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:28:06.526695 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6b24d33_cf74_4a2b_8f57_d0f3ecadabd0.slice/crio-6ba44e95d47bb1d9d6d98b840df736c46263aed0f0d6a8ea286b68da4dde3bfa WatchSource:0}: Error finding container 6ba44e95d47bb1d9d6d98b840df736c46263aed0f0d6a8ea286b68da4dde3bfa: Status 404 returned error can't find the container with id 6ba44e95d47bb1d9d6d98b840df736c46263aed0f0d6a8ea286b68da4dde3bfa Apr 20 19:28:06.743469 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.743373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw" Apr 20 19:28:06.743469 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.743418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:28:06.743650 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.743560 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " 
pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:28:06.746025 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.746002 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c49e842f-07fc-49d1-a61f-45722a72a1cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gjghs\" (UID: \"c49e842f-07fc-49d1-a61f-45722a72a1cf\") " pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:28:06.746142 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.746059 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc24b476-7aaf-4c95-b13e-44550d15e793-cert\") pod \"ingress-canary-kxblw\" (UID: \"bc24b476-7aaf-4c95-b13e-44550d15e793\") " pod="openshift-ingress-canary/ingress-canary-kxblw" Apr 20 19:28:06.746142 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.746065 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9a6ffc3-ed3d-4922-acb0-cf3513a1d431-metrics-tls\") pod \"dns-default-92xgv\" (UID: \"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431\") " pod="openshift-dns/dns-default-92xgv" Apr 20 19:28:06.819160 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.819135 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gjghs" Apr 20 19:28:06.897174 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.897147 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5vnpt\"" Apr 20 19:28:06.897337 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.897197 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbf4z\"" Apr 20 19:28:06.905097 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.905072 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-92xgv" Apr 20 19:28:06.905097 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.905091 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kxblw" Apr 20 19:28:06.964951 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:06.964898 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gjghs"] Apr 20 19:28:06.977644 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:28:06.977609 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc49e842f_07fc_49d1_a61f_45722a72a1cf.slice/crio-25c8c4f390bf73774a981d3cafe51dd11fa2d8142b73909574ad15219ea7695e WatchSource:0}: Error finding container 25c8c4f390bf73774a981d3cafe51dd11fa2d8142b73909574ad15219ea7695e: Status 404 returned error can't find the container with id 25c8c4f390bf73774a981d3cafe51dd11fa2d8142b73909574ad15219ea7695e Apr 20 19:28:07.058059 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:07.058029 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-92xgv"] Apr 20 19:28:07.061604 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:28:07.061577 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9a6ffc3_ed3d_4922_acb0_cf3513a1d431.slice/crio-2fb1cf2a22b92e49e82d33a4801b72befc6dc0287827891d3c9aba88de058c9e WatchSource:0}: Error finding container 2fb1cf2a22b92e49e82d33a4801b72befc6dc0287827891d3c9aba88de058c9e: Status 404 returned error can't find the container with id 2fb1cf2a22b92e49e82d33a4801b72befc6dc0287827891d3c9aba88de058c9e Apr 20 19:28:07.078320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:07.078285 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kxblw"] Apr 20 19:28:07.081957 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:28:07.081932 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc24b476_7aaf_4c95_b13e_44550d15e793.slice/crio-20ed0b16ecd623f1483d7de3c623030176539be542fbc8bd0ceb04a3009c6ad7 WatchSource:0}: Error finding container 20ed0b16ecd623f1483d7de3c623030176539be542fbc8bd0ceb04a3009c6ad7: Status 404 returned error can't find the container with id 20ed0b16ecd623f1483d7de3c623030176539be542fbc8bd0ceb04a3009c6ad7 Apr 20 19:28:07.107686 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:07.107649 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kxblw" event={"ID":"bc24b476-7aaf-4c95-b13e-44550d15e793","Type":"ContainerStarted","Data":"20ed0b16ecd623f1483d7de3c623030176539be542fbc8bd0ceb04a3009c6ad7"} Apr 20 19:28:07.108890 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:07.108864 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gjghs" event={"ID":"c49e842f-07fc-49d1-a61f-45722a72a1cf","Type":"ContainerStarted","Data":"471c9f9e03fabcda2bcc551942e956c76fb544ca7030380b69c8a53a511b9568"} Apr 20 19:28:07.108970 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:07.108893 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gjghs" event={"ID":"c49e842f-07fc-49d1-a61f-45722a72a1cf","Type":"ContainerStarted","Data":"25c8c4f390bf73774a981d3cafe51dd11fa2d8142b73909574ad15219ea7695e"} Apr 20 19:28:07.109836 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:07.109815 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n" event={"ID":"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0","Type":"ContainerStarted","Data":"6ba44e95d47bb1d9d6d98b840df736c46263aed0f0d6a8ea286b68da4dde3bfa"} Apr 20 19:28:07.110586 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:07.110568 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-92xgv" event={"ID":"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431","Type":"ContainerStarted","Data":"2fb1cf2a22b92e49e82d33a4801b72befc6dc0287827891d3c9aba88de058c9e"} Apr 20 19:28:08.115815 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:08.115772 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gjghs" event={"ID":"c49e842f-07fc-49d1-a61f-45722a72a1cf","Type":"ContainerStarted","Data":"362b081c8d6ea6796d34a125b9d0210a61e1ef78e57f94d350eb79f32b3cc02f"} Apr 20 19:28:09.602662 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:09.598964 2572 scope.go:117] "RemoveContainer" containerID="839bd4e59ca85fffc5ae39602b88377407438178a3ff16af876be7cc696de0a8" Apr 20 19:28:10.123931 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.123825 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-92xgv" event={"ID":"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431","Type":"ContainerStarted","Data":"226635083a5a648eac7f707ff0ec5ff6e5fe1f0a77386f4aa1730913f20cbccc"} Apr 20 19:28:10.123931 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.123875 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-92xgv" event={"ID":"b9a6ffc3-ed3d-4922-acb0-cf3513a1d431","Type":"ContainerStarted","Data":"35db8818a4b9929d6aa0f9a8985abde564bfa7ac03dde9d7da5198b499fdccf5"} Apr 20 19:28:10.123931 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.123919 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-92xgv" Apr 20 19:28:10.125179 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.125157 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kxblw" event={"ID":"bc24b476-7aaf-4c95-b13e-44550d15e793","Type":"ContainerStarted","Data":"57c3fe0a9aa4fbca81743188c49ced373a606df3b0ba8f51b398cfdc6fdc903a"} Apr 20 19:28:10.126956 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.126932 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gjghs" event={"ID":"c49e842f-07fc-49d1-a61f-45722a72a1cf","Type":"ContainerStarted","Data":"26c8a1cd9e498345beaca6babc7028882717a106a9cdc33ce65b0bde072a7373"} Apr 20 19:28:10.128688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.128667 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/1.log" Apr 20 19:28:10.128793 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.128742 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s" event={"ID":"a25d18d6-5add-4c28-a671-0ee5222cb999","Type":"ContainerStarted","Data":"8d8f22edd7fd4ee876c118cfed04a6d7b0d39a8a5f1194cef6f2ae5dab1c98f6"} Apr 20 19:28:10.129034 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.128994 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s" Apr 20 19:28:10.130176 ip-10-0-131-162 kubenswrapper[2572]: I0420 
19:28:10.130156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n" event={"ID":"b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0","Type":"ContainerStarted","Data":"8d17f2fde668d38a38c312f50bf25633110f7818b74bac12e328c4dd4e0eeb76"} Apr 20 19:28:10.141237 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.141195 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-92xgv" podStartSLOduration=129.508693004 podStartE2EDuration="2m12.141183135s" podCreationTimestamp="2026-04-20 19:25:58 +0000 UTC" firstStartedPulling="2026-04-20 19:28:07.063590876 +0000 UTC m=+161.054078392" lastFinishedPulling="2026-04-20 19:28:09.696080998 +0000 UTC m=+163.686568523" observedRunningTime="2026-04-20 19:28:10.139611651 +0000 UTC m=+164.130099187" watchObservedRunningTime="2026-04-20 19:28:10.141183135 +0000 UTC m=+164.131670662" Apr 20 19:28:10.154641 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.154600 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kxblw" podStartSLOduration=129.54247857 podStartE2EDuration="2m12.154588609s" podCreationTimestamp="2026-04-20 19:25:58 +0000 UTC" firstStartedPulling="2026-04-20 19:28:07.083967568 +0000 UTC m=+161.074455098" lastFinishedPulling="2026-04-20 19:28:09.696077613 +0000 UTC m=+163.686565137" observedRunningTime="2026-04-20 19:28:10.154291867 +0000 UTC m=+164.144779404" watchObservedRunningTime="2026-04-20 19:28:10.154588609 +0000 UTC m=+164.145076143" Apr 20 19:28:10.232681 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.232623 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gjghs" podStartSLOduration=17.55261033 podStartE2EDuration="20.232603553s" podCreationTimestamp="2026-04-20 19:27:50 +0000 UTC" firstStartedPulling="2026-04-20 19:28:07.051615409 +0000 UTC m=+161.042102927" lastFinishedPulling="2026-04-20 19:28:09.731608633 +0000 UTC m=+163.722096150" observedRunningTime="2026-04-20 19:28:10.232152162 +0000 UTC m=+164.222639698" watchObservedRunningTime="2026-04-20 19:28:10.232603553 +0000 UTC m=+164.223091088" Apr 20 19:28:10.233628 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.233587 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s" podStartSLOduration=23.410284289 podStartE2EDuration="26.233575824s" podCreationTimestamp="2026-04-20 19:27:44 +0000 UTC" firstStartedPulling="2026-04-20 19:27:44.943060518 +0000 UTC m=+138.933548031" lastFinishedPulling="2026-04-20 19:27:47.766352035 +0000 UTC m=+141.756839566" observedRunningTime="2026-04-20 19:28:10.201416133 +0000 UTC m=+164.191903668" watchObservedRunningTime="2026-04-20 19:28:10.233575824 +0000 UTC m=+164.224063411" Apr 20 19:28:10.248966 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.248924 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8796n" podStartSLOduration=33.081870311 podStartE2EDuration="36.248908733s" podCreationTimestamp="2026-04-20 19:27:34 +0000 UTC" firstStartedPulling="2026-04-20 19:28:06.52875805 +0000 UTC m=+160.519245567" lastFinishedPulling="2026-04-20 19:28:09.69579646 +0000 UTC m=+163.686283989" observedRunningTime="2026-04-20 19:28:10.247874957 +0000 UTC m=+164.238362493" watchObservedRunningTime="2026-04-20 19:28:10.248908733 +0000 UTC 
m=+164.239396270" Apr 20 19:28:10.471232 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:10.471158 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-qpl5s" Apr 20 19:28:14.009845 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.009808 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt"] Apr 20 19:28:14.011743 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.011722 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt" Apr 20 19:28:14.013005 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.012980 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b69bfb95-snnfp"] Apr 20 19:28:14.014820 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.014803 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-62r2g\"" Apr 20 19:28:14.014916 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.014838 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-7xljr"] Apr 20 19:28:14.014984 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.014966 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 19:28:14.015033 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.015008 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b69bfb95-snnfp"
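A note on the "Observed pod startup duration" entries just above: the fields are internally consistent, in that podStartSLOduration appears to be podStartE2EDuration minus the image-pull window (lastFinishedPulling minus firstStartedPulling), so the SLO figure excludes time spent pulling images. A minimal sketch checking that arithmetic for dns-default-92xgv, with all values copied from the entry above (the field semantics are inferred from these log lines, not taken from kubelet source):

```python
# Check: podStartSLOduration == podStartE2EDuration - image pull window,
# using the dns-default-92xgv values logged above.
from datetime import datetime

def ts(s: str) -> datetime:
    # Fields look like "2026-04-20 19:28:07.063590876 +0000 UTC";
    # %f takes at most 6 digits, so truncate nanoseconds to microseconds.
    date, clock = s.split(" ")[0], s.split(" ")[1][:15]
    return datetime.strptime(f"{date} {clock}", "%Y-%m-%d %H:%M:%S.%f")

first_pull = ts("2026-04-20 19:28:07.063590876 +0000 UTC")  # firstStartedPulling
last_pull  = ts("2026-04-20 19:28:09.696080998 +0000 UTC")  # lastFinishedPulling
e2e = 132.141183135                            # podStartE2EDuration = 2m12.141183135s
print(round(e2e - (last_pull - first_pull).total_seconds(), 6))
# -> 129.508693, matching podStartSLOduration=129.508693004
# to within nanosecond rounding
```

The image-registry entry further down shows the degenerate case: with zero-value pull timestamps ("0001-01-01 ..."), podStartSLOduration and podStartE2EDuration come out identical.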
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7xljr" Apr 20 19:28:14.022034 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.022018 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 19:28:14.022120 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.022092 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-94hbn\"" Apr 20 19:28:14.022120 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.022025 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 19:28:14.022285 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.022268 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 19:28:14.022367 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.022353 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8vchn\"" Apr 20 19:28:14.022492 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.022480 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 19:28:14.023473 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.023432 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 19:28:14.027816 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.027799 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 19:28:14.031299 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.031147 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt"] Apr 20 19:28:14.034832 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.034813 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b69bfb95-snnfp"] Apr 20 19:28:14.035821 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.035804 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7xljr"] Apr 20 19:28:14.108762 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.108731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d5f453f-1b70-4078-8a96-844251489d5c-trusted-ca\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.108928 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.108834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d5f453f-1b70-4078-8a96-844251489d5c-ca-trust-extracted\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.108928 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.108867 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/6d5f453f-1b70-4078-8a96-844251489d5c-bound-sa-token\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.108928 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.108894 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d5f453f-1b70-4078-8a96-844251489d5c-installation-pull-secrets\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.108928 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.108924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjjlg\" (UniqueName: \"kubernetes.io/projected/ecc56b03-14e4-4238-8c9a-7974d6774b23-kube-api-access-bjjlg\") pod \"downloads-6bcc868b7-7xljr\" (UID: \"ecc56b03-14e4-4238-8c9a-7974d6774b23\") " pod="openshift-console/downloads-6bcc868b7-7xljr" Apr 20 19:28:14.109096 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.108944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6d5f453f-1b70-4078-8a96-844251489d5c-image-registry-private-configuration\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.109096 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.108960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d5f453f-1b70-4078-8a96-844251489d5c-registry-tls\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.109096 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.108979 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7wgl\" (UniqueName: \"kubernetes.io/projected/6d5f453f-1b70-4078-8a96-844251489d5c-kube-api-access-k7wgl\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.109096 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.109009 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d5f453f-1b70-4078-8a96-844251489d5c-registry-certificates\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.109096 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.109032 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ea330750-040a-4755-8c76-18743a732d31-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bldkt\" (UID: \"ea330750-040a-4755-8c76-18743a732d31\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt" Apr 20 19:28:14.210148 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.210110 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d5f453f-1b70-4078-8a96-844251489d5c-bound-sa-token\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.210356 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.210156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d5f453f-1b70-4078-8a96-844251489d5c-installation-pull-secrets\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.210356 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.210190 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjjlg\" (UniqueName: \"kubernetes.io/projected/ecc56b03-14e4-4238-8c9a-7974d6774b23-kube-api-access-bjjlg\") pod \"downloads-6bcc868b7-7xljr\" (UID: \"ecc56b03-14e4-4238-8c9a-7974d6774b23\") " pod="openshift-console/downloads-6bcc868b7-7xljr" Apr 20 19:28:14.210356 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.210214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6d5f453f-1b70-4078-8a96-844251489d5c-image-registry-private-configuration\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.210356 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.210237 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d5f453f-1b70-4078-8a96-844251489d5c-registry-tls\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.210713 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.210401 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7wgl\" (UniqueName: \"kubernetes.io/projected/6d5f453f-1b70-4078-8a96-844251489d5c-kube-api-access-k7wgl\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.210713 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.210485 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d5f453f-1b70-4078-8a96-844251489d5c-registry-certificates\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.210713 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.210521 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ea330750-040a-4755-8c76-18743a732d31-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bldkt\" (UID: \"ea330750-040a-4755-8c76-18743a732d31\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt" Apr 20 19:28:14.210713 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.210551 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d5f453f-1b70-4078-8a96-844251489d5c-trusted-ca\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.210713 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.210646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d5f453f-1b70-4078-8a96-844251489d5c-ca-trust-extracted\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.210985 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.210966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d5f453f-1b70-4078-8a96-844251489d5c-ca-trust-extracted\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.211416 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.211390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d5f453f-1b70-4078-8a96-844251489d5c-registry-certificates\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.211648 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.211627 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d5f453f-1b70-4078-8a96-844251489d5c-trusted-ca\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.213039 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.213013 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d5f453f-1b70-4078-8a96-844251489d5c-registry-tls\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.213132 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.213037 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d5f453f-1b70-4078-8a96-844251489d5c-installation-pull-secrets\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.213176 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.213142 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ea330750-040a-4755-8c76-18743a732d31-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bldkt\" (UID: \"ea330750-040a-4755-8c76-18743a732d31\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt" Apr 20 19:28:14.213176 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.213160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/6d5f453f-1b70-4078-8a96-844251489d5c-image-registry-private-configuration\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.219314 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.219278 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d5f453f-1b70-4078-8a96-844251489d5c-bound-sa-token\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.219411 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.219340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjjlg\" (UniqueName: \"kubernetes.io/projected/ecc56b03-14e4-4238-8c9a-7974d6774b23-kube-api-access-bjjlg\") pod \"downloads-6bcc868b7-7xljr\" (UID: \"ecc56b03-14e4-4238-8c9a-7974d6774b23\") " pod="openshift-console/downloads-6bcc868b7-7xljr" Apr 20 19:28:14.219411 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.219359 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7wgl\" (UniqueName: \"kubernetes.io/projected/6d5f453f-1b70-4078-8a96-844251489d5c-kube-api-access-k7wgl\") pod \"image-registry-b69bfb95-snnfp\" (UID: \"6d5f453f-1b70-4078-8a96-844251489d5c\") " pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.322014 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.321983 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt" Apr 20 19:28:14.328788 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.328757 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:14.334426 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.334402 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7xljr" Apr 20 19:28:14.459579 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.459550 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt"] Apr 20 19:28:14.462351 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:28:14.462321 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea330750_040a_4755_8c76_18743a732d31.slice/crio-5de9d78d94681e31cfe08c934c56f8912556daaeb45af05f9d51aebec4abab81 WatchSource:0}: Error finding container 5de9d78d94681e31cfe08c934c56f8912556daaeb45af05f9d51aebec4abab81: Status 404 returned error can't find the container with id 5de9d78d94681e31cfe08c934c56f8912556daaeb45af05f9d51aebec4abab81 Apr 20 19:28:14.689213 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.689143 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7xljr"] Apr 20 19:28:14.692065 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:14.692042 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b69bfb95-snnfp"] Apr 20 19:28:14.692248 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:28:14.692227 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecc56b03_14e4_4238_8c9a_7974d6774b23.slice/crio-b474f311a3a4a600aa4bc8f69c65e20729d151c7345f497dbb9930130449e8e4 WatchSource:0}: Error finding container b474f311a3a4a600aa4bc8f69c65e20729d151c7345f497dbb9930130449e8e4: Status 404 returned error can't find the container with id b474f311a3a4a600aa4bc8f69c65e20729d151c7345f497dbb9930130449e8e4 Apr 20 19:28:14.694660 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:28:14.694634 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d5f453f_1b70_4078_8a96_844251489d5c.slice/crio-7c946a0cb2ca1fe2432e088b9b8bc5a01eb135224a7236230223cd95ed0beeb4 WatchSource:0}: Error finding container 7c946a0cb2ca1fe2432e088b9b8bc5a01eb135224a7236230223cd95ed0beeb4: Status 404 returned error can't find the container with id 7c946a0cb2ca1fe2432e088b9b8bc5a01eb135224a7236230223cd95ed0beeb4 Apr 20 19:28:15.152316 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:15.152197 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7xljr" event={"ID":"ecc56b03-14e4-4238-8c9a-7974d6774b23","Type":"ContainerStarted","Data":"b474f311a3a4a600aa4bc8f69c65e20729d151c7345f497dbb9930130449e8e4"} Apr 20 19:28:15.153743 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:15.153712 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt" event={"ID":"ea330750-040a-4755-8c76-18743a732d31","Type":"ContainerStarted","Data":"5de9d78d94681e31cfe08c934c56f8912556daaeb45af05f9d51aebec4abab81"} Apr 20 19:28:15.155045 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:15.155013 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b69bfb95-snnfp" event={"ID":"6d5f453f-1b70-4078-8a96-844251489d5c","Type":"ContainerStarted","Data":"9bdb69481eda4ad0a2d417a9bf925f93622e26fdbcbdfded9be0bc124952ad42"} Apr 20 19:28:15.155157 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:15.155054 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-b69bfb95-snnfp" event={"ID":"6d5f453f-1b70-4078-8a96-844251489d5c","Type":"ContainerStarted","Data":"7c946a0cb2ca1fe2432e088b9b8bc5a01eb135224a7236230223cd95ed0beeb4"} Apr 20 19:28:15.155387 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:15.155363 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:15.173982 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:15.173923 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-b69bfb95-snnfp" podStartSLOduration=2.173904018 podStartE2EDuration="2.173904018s" podCreationTimestamp="2026-04-20 19:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:28:15.17242702 +0000 UTC m=+169.162914552" watchObservedRunningTime="2026-04-20 19:28:15.173904018 +0000 UTC m=+169.164391554" Apr 20 19:28:16.159255 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:16.159211 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt" event={"ID":"ea330750-040a-4755-8c76-18743a732d31","Type":"ContainerStarted","Data":"64579ffec8e0af34802da433bd2241a2e8982f6973cb181aa52a3fe2b93a83b1"} Apr 20 19:28:16.175322 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:16.175264 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt" podStartSLOduration=2.217911662 podStartE2EDuration="3.17524654s" podCreationTimestamp="2026-04-20 19:28:13 +0000 UTC" firstStartedPulling="2026-04-20 19:28:14.464204913 +0000 UTC m=+168.454692426" lastFinishedPulling="2026-04-20 19:28:15.421539773 +0000 UTC m=+169.412027304" observedRunningTime="2026-04-20 19:28:16.175035823 +0000 UTC m=+170.165523358" watchObservedRunningTime="2026-04-20 19:28:16.17524654 +0000 UTC m=+170.165734076" Apr 20 19:28:16.600837 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:16.600802 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:28:17.162808 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:17.162777 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt" Apr 20 19:28:17.168142 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:17.168116 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bldkt" Apr 20 19:28:20.136268 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:20.136227 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-92xgv" Apr 20 19:28:21.162964 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.162931 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-ff7c97478-zk79v"] Apr 20 19:28:21.200327 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.200284 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ff7c97478-zk79v"] Apr 20 19:28:21.200531 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.200481 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.204098 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.203981 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 19:28:21.204924 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.204898 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 19:28:21.205065 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.204937 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 19:28:21.205065 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.205004 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wvcd6\"" Apr 20 19:28:21.205263 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.205242 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 19:28:21.205375 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.205293 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 19:28:21.277429 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.277391 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-service-ca\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.277667 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.277475 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-oauth-config\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.277667 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.277526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-config\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.277667 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.277568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-oauth-serving-cert\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.277667 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.277605 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-serving-cert\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.277667 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.277658 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z489v\" (UniqueName: \"kubernetes.io/projected/0b696b86-2f2e-4d02-b224-efa87b93bc24-kube-api-access-z489v\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.378124 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.378074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-serving-cert\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.378307 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.378143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z489v\" (UniqueName: \"kubernetes.io/projected/0b696b86-2f2e-4d02-b224-efa87b93bc24-kube-api-access-z489v\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.378307 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.378198 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-service-ca\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.378307 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.378235 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-oauth-config\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.378307 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.378251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-config\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.378307 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.378269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-oauth-serving-cert\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.379111 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.379072 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-oauth-serving-cert\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.379342 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.379313 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-service-ca\") pod \"console-ff7c97478-zk79v\" (UID: 
\"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.379756 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.379730 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-config\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.381676 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.381654 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-oauth-config\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.381790 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.381700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-serving-cert\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.388570 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.388543 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z489v\" (UniqueName: \"kubernetes.io/projected/0b696b86-2f2e-4d02-b224-efa87b93bc24-kube-api-access-z489v\") pod \"console-ff7c97478-zk79v\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.511923 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.511841 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:21.664129 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.664101 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ff7c97478-zk79v"] Apr 20 19:28:21.666478 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:28:21.666423 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b696b86_2f2e_4d02_b224_efa87b93bc24.slice/crio-798483943bbb16abe098a2de560af99124d706d64058acad264e272c2b63f93d WatchSource:0}: Error finding container 798483943bbb16abe098a2de560af99124d706d64058acad264e272c2b63f93d: Status 404 returned error can't find the container with id 798483943bbb16abe098a2de560af99124d706d64058acad264e272c2b63f93d Apr 20 19:28:21.741412 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.741330 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rq7fq"] Apr 20 19:28:21.781024 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.780626 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-sdmgq"] Apr 20 19:28:21.782160 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.781358 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.786528 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.786285 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-f8zn6\"" Apr 20 19:28:21.786528 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.786306 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 20 19:28:21.786785 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.786765 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 20 19:28:21.786845 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.786805 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 20 19:28:21.787300 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.787282 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 19:28:21.793402 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.793378 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rq7fq"] Apr 20 19:28:21.793587 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.793571 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.796182 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.796160 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 19:28:21.796690 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.796671 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 19:28:21.796793 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.796746 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 19:28:21.797429 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.797394 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-t69jz\"" Apr 20 19:28:21.881841 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.881796 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7295835-02aa-4369-a6a8-e0d1bab163a9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.882016 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.881847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/71f4e361-c121-46fd-bedb-ab6e0d2489a4-root\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.882016 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.881902 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b7295835-02aa-4369-a6a8-e0d1bab163a9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.882016 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.881932 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d589w\" (UniqueName: \"kubernetes.io/projected/71f4e361-c121-46fd-bedb-ab6e0d2489a4-kube-api-access-d589w\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.882016 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.881964 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71f4e361-c121-46fd-bedb-ab6e0d2489a4-metrics-client-ca\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.882016 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.882015 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-tls\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.882332 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.882042 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.882332 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.882070 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-textfile\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.882332 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.882110 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7295835-02aa-4369-a6a8-e0d1bab163a9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.882332 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.882139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7295835-02aa-4369-a6a8-e0d1bab163a9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.882332 ip-10-0-131-162 
kubenswrapper[2572]: I0420 19:28:21.882173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b7295835-02aa-4369-a6a8-e0d1bab163a9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.882332 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.882203 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71f4e361-c121-46fd-bedb-ab6e0d2489a4-sys\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.882332 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.882227 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgfk7\" (UniqueName: \"kubernetes.io/projected/b7295835-02aa-4369-a6a8-e0d1bab163a9-kube-api-access-hgfk7\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.882332 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.882263 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.882332 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.882296 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-wtmp\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.983292 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7295835-02aa-4369-a6a8-e0d1bab163a9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.983504 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983311 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b7295835-02aa-4369-a6a8-e0d1bab163a9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.983504 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983342 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71f4e361-c121-46fd-bedb-ab6e0d2489a4-sys\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.983504 ip-10-0-131-162 
kubenswrapper[2572]: I0420 19:28:21.983365 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgfk7\" (UniqueName: \"kubernetes.io/projected/b7295835-02aa-4369-a6a8-e0d1bab163a9-kube-api-access-hgfk7\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.983504 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.983504 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-wtmp\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.983504 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983473 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7295835-02aa-4369-a6a8-e0d1bab163a9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.983504 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983505 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/71f4e361-c121-46fd-bedb-ab6e0d2489a4-root\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.983958 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983550 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b7295835-02aa-4369-a6a8-e0d1bab163a9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.983958 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d589w\" (UniqueName: \"kubernetes.io/projected/71f4e361-c121-46fd-bedb-ab6e0d2489a4-kube-api-access-d589w\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.983958 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71f4e361-c121-46fd-bedb-ab6e0d2489a4-metrics-client-ca\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.983958 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983667 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-tls\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.983958 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983692 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.983958 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.983752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/71f4e361-c121-46fd-bedb-ab6e0d2489a4-root\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.984381 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:28:21.984251 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 19:28:21.984381 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:28:21.984312 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-tls podName:71f4e361-c121-46fd-bedb-ab6e0d2489a4 nodeName:}" failed. No retries permitted until 2026-04-20 19:28:22.484293497 +0000 UTC m=+176.474781015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-tls") pod "node-exporter-sdmgq" (UID: "71f4e361-c121-46fd-bedb-ab6e0d2489a4") : secret "node-exporter-tls" not found Apr 20 19:28:21.984381 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.984337 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71f4e361-c121-46fd-bedb-ab6e0d2489a4-metrics-client-ca\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.984971 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.984945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b7295835-02aa-4369-a6a8-e0d1bab163a9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.985059 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.985011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71f4e361-c121-46fd-bedb-ab6e0d2489a4-sys\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.985454 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.985421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-wtmp\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " 
pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.985562 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.985544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-textfile\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.985620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.985587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7295835-02aa-4369-a6a8-e0d1bab163a9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.986164 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.986142 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7295835-02aa-4369-a6a8-e0d1bab163a9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.986370 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.986353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-textfile\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.988479 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.988416 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7295835-02aa-4369-a6a8-e0d1bab163a9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.989000 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.988981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7295835-02aa-4369-a6a8-e0d1bab163a9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.989711 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.989585 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.995686 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.994987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d589w\" (UniqueName: \"kubernetes.io/projected/71f4e361-c121-46fd-bedb-ab6e0d2489a4-kube-api-access-d589w\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.996021 ip-10-0-131-162 
kubenswrapper[2572]: I0420 19:28:21.995923 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgfk7\" (UniqueName: \"kubernetes.io/projected/b7295835-02aa-4369-a6a8-e0d1bab163a9-kube-api-access-hgfk7\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:21.997719 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.997671 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:21.997719 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:21.997686 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b7295835-02aa-4369-a6a8-e0d1bab163a9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rq7fq\" (UID: \"b7295835-02aa-4369-a6a8-e0d1bab163a9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:22.095110 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:22.095068 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" Apr 20 19:28:22.180172 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:22.180125 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ff7c97478-zk79v" event={"ID":"0b696b86-2f2e-4d02-b224-efa87b93bc24","Type":"ContainerStarted","Data":"798483943bbb16abe098a2de560af99124d706d64058acad264e272c2b63f93d"} Apr 20 19:28:22.241132 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:22.241103 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rq7fq"] Apr 20 19:28:22.262974 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:28:22.262926 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7295835_02aa_4369_a6a8_e0d1bab163a9.slice/crio-e7c75ca1535e40146a55b967c27eb7b90624583c1f9b8f3b331b56ad7aad2fe5 WatchSource:0}: Error finding container e7c75ca1535e40146a55b967c27eb7b90624583c1f9b8f3b331b56ad7aad2fe5: Status 404 returned error can't find the container with id e7c75ca1535e40146a55b967c27eb7b90624583c1f9b8f3b331b56ad7aad2fe5 Apr 20 19:28:22.490236 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:22.490145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-tls\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:22.493869 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:22.493799 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/71f4e361-c121-46fd-bedb-ab6e0d2489a4-node-exporter-tls\") pod \"node-exporter-sdmgq\" (UID: \"71f4e361-c121-46fd-bedb-ab6e0d2489a4\") " pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:22.706660 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:22.705826 2572 
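The two E-level lines at 19:28:21.984 are the only real failure in this stretch, and they are transient: the pod spec references the secret openshift-monitoring/node-exporter-tls (presumably published by OpenShift's service serving-cert signer) before that secret exists, so MountVolume.SetUp fails and nestedpendingoperations schedules a retry (durationBeforeRetry 500ms here; kubelet lengthens the delay on repeated failures). The retry at 19:28:22.490 succeeds, so this is a startup race, not a persistent fault. As a minimal sketch of how one might confirm the secret's existence from outside the kubelet, using client-go (the namespace and secret name come from the log; the kubeconfig path and everything else are assumptions of this example):

    // check_secret.go: illustrative only, not kubelet code.
    package main

    import (
        "context"
        "fmt"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumes a local kubeconfig; in-cluster config would work the same way.
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        _, err = cs.CoreV1().Secrets("openshift-monitoring").Get(
            context.TODO(), "node-exporter-tls", metav1.GetOptions{})
        switch {
        case apierrors.IsNotFound(err):
            // Matches the "secret ... not found" state in the log above;
            // kubelet will keep retrying the mount until this clears.
            fmt.Println("secret not created yet")
        case err != nil:
            panic(err)
        default:
            fmt.Println("secret exists; the next MountVolume retry should succeed")
        }
    }

If IsNotFound persisted for minutes rather than milliseconds, the thing to investigate would be the controller that publishes the secret, not the kubelet.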
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sdmgq" Apr 20 19:28:22.726146 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:28:22.726100 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71f4e361_c121_46fd_bedb_ab6e0d2489a4.slice/crio-46f809609afea37eca20c7e8606d53f9c745199b482055081c77d9f338d35288 WatchSource:0}: Error finding container 46f809609afea37eca20c7e8606d53f9c745199b482055081c77d9f338d35288: Status 404 returned error can't find the container with id 46f809609afea37eca20c7e8606d53f9c745199b482055081c77d9f338d35288 Apr 20 19:28:23.185182 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.185100 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sdmgq" event={"ID":"71f4e361-c121-46fd-bedb-ab6e0d2489a4","Type":"ContainerStarted","Data":"46f809609afea37eca20c7e8606d53f9c745199b482055081c77d9f338d35288"} Apr 20 19:28:23.186862 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.186828 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" event={"ID":"b7295835-02aa-4369-a6a8-e0d1bab163a9","Type":"ContainerStarted","Data":"e7c75ca1535e40146a55b967c27eb7b90624583c1f9b8f3b331b56ad7aad2fe5"} Apr 20 19:28:23.808293 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.808221 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-98cd4df66-gw2jt"] Apr 20 19:28:23.812891 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.812865 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:23.815799 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.815768 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 19:28:23.815929 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.815839 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 19:28:23.815929 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.815881 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 19:28:23.815929 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.815879 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 19:28:23.816574 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.815992 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-qvlpg\"" Apr 20 19:28:23.816574 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.816123 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 19:28:23.816574 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.816159 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-2uk5037vu63tc\"" Apr 20 19:28:23.824656 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.822858 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-98cd4df66-gw2jt"] Apr 20 19:28:23.903855 
ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.903819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-tls\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:23.904015 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.903868 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-metrics-client-ca\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:23.904015 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.903901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:23.904122 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.904016 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:23.904122 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.904079 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmngn\" (UniqueName: \"kubernetes.io/projected/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-kube-api-access-rmngn\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:23.904122 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.904114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-grpc-tls\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:23.904262 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.904173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:23.904262 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:23.904202 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.005615 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.005582 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-tls\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.005823 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.005628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-metrics-client-ca\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.005900 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.005816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.005900 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.005891 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.006006 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.005949 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmngn\" (UniqueName: \"kubernetes.io/projected/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-kube-api-access-rmngn\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.006006 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.005987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-grpc-tls\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.006107 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.006044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.006107 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.006073 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.006594 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.006475 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-metrics-client-ca\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.008854 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.008802 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-tls\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.009114 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.009093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-grpc-tls\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.009534 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.009512 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.009641 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.009619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.009701 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.009619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.009999 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.009981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.014473 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.014433 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmngn\" (UniqueName: \"kubernetes.io/projected/a558665e-4811-4d1f-b02d-c3e1dc6a92c5-kube-api-access-rmngn\") pod \"thanos-querier-98cd4df66-gw2jt\" (UID: \"a558665e-4811-4d1f-b02d-c3e1dc6a92c5\") " pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:24.129240 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:24.129201 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:26.957988 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:26.957948 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6"] Apr 20 19:28:26.963479 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:26.963432 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:26.966214 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:26.966077 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 19:28:26.966214 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:26.966107 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 19:28:26.966214 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:26.966138 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 19:28:26.966214 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:26.966167 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 19:28:26.966488 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:26.966266 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-gw2rf\"" Apr 20 19:28:26.966488 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:26.966389 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 19:28:26.972019 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:26.971997 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 19:28:26.973418 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:26.973390 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6"] Apr 20 19:28:27.136659 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.136617 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-serving-certs-ca-bundle\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.136844 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.136671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-secret-telemeter-client\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: 
\"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.136844 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.136748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-metrics-client-ca\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.136844 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.136795 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.136844 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.136838 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgw2n\" (UniqueName: \"kubernetes.io/projected/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-kube-api-access-hgw2n\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.137002 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.136941 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-telemeter-client-tls\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.137002 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.136994 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.137074 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.137048 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-federate-client-tls\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.238418 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.238333 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.238418 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.238386 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-federate-client-tls\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.238663 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.238453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-serving-certs-ca-bundle\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.238663 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.238486 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-secret-telemeter-client\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.238663 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.238520 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-metrics-client-ca\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.238663 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.238555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.238663 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.238604 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgw2n\" (UniqueName: \"kubernetes.io/projected/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-kube-api-access-hgw2n\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.238663 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.238653 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-telemeter-client-tls\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.239485 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.239392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-metrics-client-ca\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.239612 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.239515 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-serving-certs-ca-bundle\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.239890 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.239866 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.241372 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.241322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-federate-client-tls\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.241530 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.241507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.241664 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.241637 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-secret-telemeter-client\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.241719 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.241699 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-telemeter-client-tls\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.247020 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.246994 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgw2n\" (UniqueName: \"kubernetes.io/projected/72ead48b-c34a-4cb6-80ad-02aa7e0bf463-kube-api-access-hgw2n\") pod \"telemeter-client-5cf88d5bcf-5p6r6\" (UID: \"72ead48b-c34a-4cb6-80ad-02aa7e0bf463\") " pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.275995 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.275953 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" Apr 20 19:28:27.948419 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.948379 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:28:27.953712 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.953686 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:27.956424 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.956400 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 19:28:27.956559 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.956461 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tncz6\"" Apr 20 19:28:27.956750 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.956670 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 19:28:27.956750 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.956717 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 19:28:27.956976 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.956828 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 19:28:27.956976 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.956910 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 19:28:27.957084 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.956991 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 19:28:27.957084 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.957014 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-51sjjv57t37t6\"" Apr 20 19:28:27.957174 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.957087 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 19:28:27.957320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.957297 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 19:28:27.957423 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.957355 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 19:28:27.957423 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.957416 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 19:28:27.957551 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.957461 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 19:28:27.957551 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.957479 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 19:28:27.959833 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.959745 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 19:28:27.962803 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:27.962783 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:28:28.047598 
ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.047555 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.047789 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.047618 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.047789 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.047647 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs2bv\" (UniqueName: \"kubernetes.io/projected/269f1398-7daa-4518-b6a5-65f56c2971de-kube-api-access-cs2bv\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.047789 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.047686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.047789 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.047716 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.047789 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.047741 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.048054 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.047838 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-config\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.048054 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.047907 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.048054 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.047941 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.048201 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.048065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/269f1398-7daa-4518-b6a5-65f56c2971de-config-out\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.048268 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.048244 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.048790 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.048743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-web-config\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.048922 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.048801 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.048922 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.048828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.048922 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.048844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.048922 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.048876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/269f1398-7daa-4518-b6a5-65f56c2971de-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.049107 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.048941 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.049107 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.048991 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.150324 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.150520 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.150520 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.150520 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-config\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.150700 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150540 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.150700 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150568 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.150700 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/269f1398-7daa-4518-b6a5-65f56c2971de-config-out\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.150700 ip-10-0-131-162 kubenswrapper[2572]: I0420 
19:28:28.150614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.150700 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150645 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-web-config\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.150700 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.150700 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150692 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.151036 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.151036 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150753 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/269f1398-7daa-4518-b6a5-65f56c2971de-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.151036 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150818 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.151036 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150875 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.151036 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150910 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-k8s-db\") pod \"prometheus-k8s-0\" 
(UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.151036 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150923 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.151036 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150949 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.151036 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.150973 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cs2bv\" (UniqueName: \"kubernetes.io/projected/269f1398-7daa-4518-b6a5-65f56c2971de-kube-api-access-cs2bv\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.151428 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.151191 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.151428 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.151224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.152610 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.152240 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.153641 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.153614 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.154238 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.154212 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/269f1398-7daa-4518-b6a5-65f56c2971de-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.154339 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.154214 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-config\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.154861 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.154837 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-web-config\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.155797 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.155489 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.155797 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.155589 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.155937 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.155878 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.156225 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.156200 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.156341 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.156324 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/269f1398-7daa-4518-b6a5-65f56c2971de-config-out\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.156551 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.156531 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.157229 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.157178 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:28.158114 
ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.158072 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:28:28.158674 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.158652 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:28:28.159694 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.159667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs2bv\" (UniqueName: \"kubernetes.io/projected/269f1398-7daa-4518-b6a5-65f56c2971de-kube-api-access-cs2bv\") pod \"prometheus-k8s-0\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:28:28.266915 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:28.266835 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:28:32.166362 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.166325 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 19:28:32.170095 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:28:32.170051 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod269f1398_7daa_4518_b6a5_65f56c2971de.slice/crio-57ed1b37c2bd15efb0aee8ae38cfcd1284f105e371052383c788c68320d6557c WatchSource:0}: Error finding container 57ed1b37c2bd15efb0aee8ae38cfcd1284f105e371052383c788c68320d6557c: Status 404 returned error can't find the container with id 57ed1b37c2bd15efb0aee8ae38cfcd1284f105e371052383c788c68320d6557c
Apr 20 19:28:32.177689 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.177124 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6"]
Apr 20 19:28:32.217829 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.217760 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerStarted","Data":"57ed1b37c2bd15efb0aee8ae38cfcd1284f105e371052383c788c68320d6557c"}
Apr 20 19:28:32.220431 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.219603 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ff7c97478-zk79v" event={"ID":"0b696b86-2f2e-4d02-b224-efa87b93bc24","Type":"ContainerStarted","Data":"ce6a16ba5c07d7774711f64caaf6cb517480f2c0c75d0b6680e74e5f01327e3d"}
Apr 20 19:28:32.221827 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.221774 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" event={"ID":"72ead48b-c34a-4cb6-80ad-02aa7e0bf463","Type":"ContainerStarted","Data":"9b6ae2a211e4644368e79eb65aae50d2cd973d40d4e3de0b7fed8b141ff328af"}
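The W-level "Failed to process watch event ... 404" entries here and earlier are a benign startup race: cAdvisor's cgroup watcher fires as soon as CRI-O creates the crio-<id> cgroup, sometimes before the runtime has finished registering the container, so the lookup 404s; the "SyncLoop (PLEG)" ContainerStarted events that follow confirm the containers came up anyway. For readers unfamiliar with PLEG (kubelet's pod lifecycle event generator), the idea behind those events is a relist-and-diff over container states; a simplified sketch with types invented for illustration (the real implementation lives in kubelet's pkg/kubelet/pleg):

    // pleg_diff.go: illustrative only, not kubelet's actual types.
    package main

    import "fmt"

    type state string

    const (
        running state = "running"
        exited  state = "exited"
    )

    // diff compares the previous and current snapshots of one pod's
    // containers and names the lifecycle event for each change, the way
    // the PLEG lines above report ContainerStarted/ContainerDied.
    func diff(prev, curr map[string]state) map[string]string {
        events := map[string]string{}
        for id, s := range curr {
            if s == running && prev[id] != running {
                events[id] = "ContainerStarted"
            }
            if s == exited && prev[id] == running {
                events[id] = "ContainerDied"
            }
        }
        return events
    }

    func main() {
        // IDs truncated from containers in this log; the node-exporter
        // container that exits 0 further down produces a ContainerDied.
        prev := map[string]state{"ca58582c165a2": running}
        curr := map[string]state{"ca58582c165a2": exited, "7d18c001e8067": running}
        fmt.Println(diff(prev, curr))
        // map[7d18c001e8067:ContainerStarted ca58582c165a2:ContainerDied]
    }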
event={"ID":"b7295835-02aa-4369-a6a8-e0d1bab163a9","Type":"ContainerStarted","Data":"625928b041d03f2848d0b71ef67c086748417c836463b7f2270a10581542bb93"} Apr 20 19:28:32.225077 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.224878 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" event={"ID":"b7295835-02aa-4369-a6a8-e0d1bab163a9","Type":"ContainerStarted","Data":"b67379aa8dbb17737adf75967be9750a49f70bde65415f844ae9668380eb2512"} Apr 20 19:28:32.236622 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.229351 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7xljr" event={"ID":"ecc56b03-14e4-4238-8c9a-7974d6774b23","Type":"ContainerStarted","Data":"26158f566f36fc5a015c3d78826980e77d0a051ec2ca34cd672d4ed91d31ea51"} Apr 20 19:28:32.236622 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.234630 2572 patch_prober.go:28] interesting pod/downloads-6bcc868b7-7xljr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.18:8080/\": dial tcp 10.133.0.18:8080: connect: connection refused" start-of-body= Apr 20 19:28:32.236622 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.234686 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-7xljr" podUID="ecc56b03-14e4-4238-8c9a-7974d6774b23" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.18:8080/\": dial tcp 10.133.0.18:8080: connect: connection refused" Apr 20 19:28:32.236622 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.235043 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-7xljr" Apr 20 19:28:32.239846 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.239222 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sdmgq" event={"ID":"71f4e361-c121-46fd-bedb-ab6e0d2489a4","Type":"ContainerStarted","Data":"ca58582c165a22c40d5b23775176f57abf4399478da3e18c93a3a1e81a4f7173"} Apr 20 19:28:32.241100 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.241052 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-ff7c97478-zk79v" podStartSLOduration=0.981820648 podStartE2EDuration="11.241036146s" podCreationTimestamp="2026-04-20 19:28:21 +0000 UTC" firstStartedPulling="2026-04-20 19:28:21.670112744 +0000 UTC m=+175.660600261" lastFinishedPulling="2026-04-20 19:28:31.929328242 +0000 UTC m=+185.919815759" observedRunningTime="2026-04-20 19:28:32.238336473 +0000 UTC m=+186.228824021" watchObservedRunningTime="2026-04-20 19:28:32.241036146 +0000 UTC m=+186.231523681" Apr 20 19:28:32.257166 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.257109 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-7xljr" podStartSLOduration=1.9743479910000001 podStartE2EDuration="19.257090452s" podCreationTimestamp="2026-04-20 19:28:13 +0000 UTC" firstStartedPulling="2026-04-20 19:28:14.694031876 +0000 UTC m=+168.684519392" lastFinishedPulling="2026-04-20 19:28:31.97677434 +0000 UTC m=+185.967261853" observedRunningTime="2026-04-20 19:28:32.256472428 +0000 UTC m=+186.246959961" watchObservedRunningTime="2026-04-20 19:28:32.257090452 +0000 UTC m=+186.247577988" Apr 20 19:28:32.372194 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:32.372165 2572 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/thanos-querier-98cd4df66-gw2jt"] Apr 20 19:28:32.374609 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:28:32.374578 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda558665e_4811_4d1f_b02d_c3e1dc6a92c5.slice/crio-9ad867d5574cbb7e564c617e1d5d14902db97003435cc039ac8cfc129f0bec57 WatchSource:0}: Error finding container 9ad867d5574cbb7e564c617e1d5d14902db97003435cc039ac8cfc129f0bec57: Status 404 returned error can't find the container with id 9ad867d5574cbb7e564c617e1d5d14902db97003435cc039ac8cfc129f0bec57 Apr 20 19:28:33.248215 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:33.248129 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" event={"ID":"b7295835-02aa-4369-a6a8-e0d1bab163a9","Type":"ContainerStarted","Data":"855530ab327781a113664c2c0bb6964df8f213e2fb1986f5ccab207386d4d570"} Apr 20 19:28:33.253099 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:33.253067 2572 generic.go:358] "Generic (PLEG): container finished" podID="71f4e361-c121-46fd-bedb-ab6e0d2489a4" containerID="ca58582c165a22c40d5b23775176f57abf4399478da3e18c93a3a1e81a4f7173" exitCode=0 Apr 20 19:28:33.253274 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:33.253170 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sdmgq" event={"ID":"71f4e361-c121-46fd-bedb-ab6e0d2489a4","Type":"ContainerDied","Data":"ca58582c165a22c40d5b23775176f57abf4399478da3e18c93a3a1e81a4f7173"} Apr 20 19:28:33.256267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:33.256241 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" event={"ID":"a558665e-4811-4d1f-b02d-c3e1dc6a92c5","Type":"ContainerStarted","Data":"9ad867d5574cbb7e564c617e1d5d14902db97003435cc039ac8cfc129f0bec57"} Apr 20 19:28:33.268259 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:33.268209 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-7xljr" Apr 20 19:28:33.269810 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:33.269755 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-rq7fq" podStartSLOduration=2.622524964 podStartE2EDuration="12.269738426s" podCreationTimestamp="2026-04-20 19:28:21 +0000 UTC" firstStartedPulling="2026-04-20 19:28:22.26518141 +0000 UTC m=+176.255668926" lastFinishedPulling="2026-04-20 19:28:31.912394861 +0000 UTC m=+185.902882388" observedRunningTime="2026-04-20 19:28:33.265940346 +0000 UTC m=+187.256427884" watchObservedRunningTime="2026-04-20 19:28:33.269738426 +0000 UTC m=+187.260225964" Apr 20 19:28:34.263269 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:34.263176 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sdmgq" event={"ID":"71f4e361-c121-46fd-bedb-ab6e0d2489a4","Type":"ContainerStarted","Data":"7d18c001e8067d187e35c1e7e07c4d71176b03f436a0428ccf96832dea9bb5ea"} Apr 20 19:28:34.263269 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:34.263223 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sdmgq" event={"ID":"71f4e361-c121-46fd-bedb-ab6e0d2489a4","Type":"ContainerStarted","Data":"7ef150087026b709a023dc9c68c6d164f46703849895bb91b3159838f928b4c2"} Apr 20 19:28:34.269196 ip-10-0-131-162 kubenswrapper[2572]: I0420 
19:28:34.269117 2572 generic.go:358] "Generic (PLEG): container finished" podID="269f1398-7daa-4518-b6a5-65f56c2971de" containerID="f62e7a89f63e49b55041e818a991a2298b6ae22a0521dfd5f2b935ffb308e5bf" exitCode=0 Apr 20 19:28:34.269516 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:34.269387 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerDied","Data":"f62e7a89f63e49b55041e818a991a2298b6ae22a0521dfd5f2b935ffb308e5bf"} Apr 20 19:28:34.294191 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:34.294119 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-sdmgq" podStartSLOduration=4.101993724 podStartE2EDuration="13.294099659s" podCreationTimestamp="2026-04-20 19:28:21 +0000 UTC" firstStartedPulling="2026-04-20 19:28:22.73301037 +0000 UTC m=+176.723497890" lastFinishedPulling="2026-04-20 19:28:31.925116308 +0000 UTC m=+185.915603825" observedRunningTime="2026-04-20 19:28:34.293013865 +0000 UTC m=+188.283501426" watchObservedRunningTime="2026-04-20 19:28:34.294099659 +0000 UTC m=+188.284587196" Apr 20 19:28:36.165025 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:36.164853 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-b69bfb95-snnfp" Apr 20 19:28:36.281363 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:36.281320 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" event={"ID":"a558665e-4811-4d1f-b02d-c3e1dc6a92c5","Type":"ContainerStarted","Data":"3a444a58c82f71149f922313a82ed631c8f2306a79cdad03ebb488ea0ccaa77f"} Apr 20 19:28:36.281363 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:36.281370 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" event={"ID":"a558665e-4811-4d1f-b02d-c3e1dc6a92c5","Type":"ContainerStarted","Data":"2a460b0f2bdb08581463e6bb5bdd53628db0c742c1542276cd7f8eaa703cbf34"} Apr 20 19:28:36.281629 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:36.281384 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" event={"ID":"a558665e-4811-4d1f-b02d-c3e1dc6a92c5","Type":"ContainerStarted","Data":"301f4a1889d7250b39212bc9f917346378a206c11d2d90daa922966d5d667574"} Apr 20 19:28:36.285598 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:36.285514 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" event={"ID":"72ead48b-c34a-4cb6-80ad-02aa7e0bf463","Type":"ContainerStarted","Data":"6b67fb694cfb967dd0a89b46496d53de6fe4459fca1667e080755bd7cfba995b"} Apr 20 19:28:36.285598 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:36.285555 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" event={"ID":"72ead48b-c34a-4cb6-80ad-02aa7e0bf463","Type":"ContainerStarted","Data":"d211b6f6a0fd877ba74ee719e98de82f8066399f0cbd1b674039ef0e79d51946"} Apr 20 19:28:36.285598 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:36.285569 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" event={"ID":"72ead48b-c34a-4cb6-80ad-02aa7e0bf463","Type":"ContainerStarted","Data":"8de3778b247e8b43d24411df13a666967309821f39d1108a11e0dceb74117ef7"} Apr 20 19:28:36.313314 ip-10-0-131-162 kubenswrapper[2572]: 
I0420 19:28:36.312624 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5cf88d5bcf-5p6r6" podStartSLOduration=7.007901322 podStartE2EDuration="10.312603037s" podCreationTimestamp="2026-04-20 19:28:26 +0000 UTC" firstStartedPulling="2026-04-20 19:28:32.191282209 +0000 UTC m=+186.181769722" lastFinishedPulling="2026-04-20 19:28:35.495983921 +0000 UTC m=+189.486471437" observedRunningTime="2026-04-20 19:28:36.310287064 +0000 UTC m=+190.300774634" watchObservedRunningTime="2026-04-20 19:28:36.312603037 +0000 UTC m=+190.303090572" Apr 20 19:28:38.248708 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:38.248661 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ff7c97478-zk79v"] Apr 20 19:28:39.300834 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:39.300791 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" event={"ID":"a558665e-4811-4d1f-b02d-c3e1dc6a92c5","Type":"ContainerStarted","Data":"1c7e874a5401df355701922acd8fe04de82771ac0d6cc4e8cd045d1d4d857fa5"} Apr 20 19:28:39.300834 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:39.300837 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" event={"ID":"a558665e-4811-4d1f-b02d-c3e1dc6a92c5","Type":"ContainerStarted","Data":"56c9b3ead61f07e80ea3cdb4527e618414f616881e8de032f89279bd1e394c4d"} Apr 20 19:28:39.301311 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:39.300851 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" event={"ID":"a558665e-4811-4d1f-b02d-c3e1dc6a92c5","Type":"ContainerStarted","Data":"3bb0531dda3526cad3523d427cf619014aad630453a22bf9b0382cdf5e4b80cc"} Apr 20 19:28:39.301311 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:39.300986 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:39.303536 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:39.303507 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerStarted","Data":"9be3aae681c3480fac29970461ba5dcf0c6713fd0a9de7405a3f06b43f0593c1"} Apr 20 19:28:39.303536 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:39.303535 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerStarted","Data":"ed6ca77a45d9f9820a5bf5448f180535e55a74631f1fe00ef4c9aadfdbb8116a"} Apr 20 19:28:39.303718 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:39.303548 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerStarted","Data":"ec97f8ede97bc37bc1c1a51b757fa87df42cfa7cf0a5f9bce3e2390d6002e5a2"} Apr 20 19:28:39.333471 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:39.333386 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" podStartSLOduration=10.369906549 podStartE2EDuration="16.333364725s" podCreationTimestamp="2026-04-20 19:28:23 +0000 UTC" firstStartedPulling="2026-04-20 19:28:32.376885755 +0000 UTC m=+186.367373269" lastFinishedPulling="2026-04-20 19:28:38.340343923 +0000 UTC m=+192.330831445" 
observedRunningTime="2026-04-20 19:28:39.331051057 +0000 UTC m=+193.321538593" watchObservedRunningTime="2026-04-20 19:28:39.333364725 +0000 UTC m=+193.323852261" Apr 20 19:28:40.314232 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:40.314190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerStarted","Data":"01f66f38280a04bcafb66f0f77aa251b84f94200b3e9a3acf953a7873eae4fe9"} Apr 20 19:28:40.314232 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:40.314240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerStarted","Data":"1575891abb2e069e28ed5059632eddee5377ce8ebcdeb0938f7c107d81cbc188"} Apr 20 19:28:40.314781 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:40.314256 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerStarted","Data":"59e1b9fed2aa1f8007aca0082b44d573cce9d71017bcff89c1c90b97ba5ffddc"} Apr 20 19:28:40.321260 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:40.321231 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-98cd4df66-gw2jt" Apr 20 19:28:40.342979 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:40.342924 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=6.724284099 podStartE2EDuration="13.34290268s" podCreationTimestamp="2026-04-20 19:28:27 +0000 UTC" firstStartedPulling="2026-04-20 19:28:32.174955091 +0000 UTC m=+186.165442622" lastFinishedPulling="2026-04-20 19:28:38.793573686 +0000 UTC m=+192.784061203" observedRunningTime="2026-04-20 19:28:40.34062753 +0000 UTC m=+194.331115099" watchObservedRunningTime="2026-04-20 19:28:40.34290268 +0000 UTC m=+194.333390219" Apr 20 19:28:41.512179 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:41.512137 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:28:43.267180 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:43.267153 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:48.100208 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:48.100178 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kxblw_bc24b476-7aaf-4c95-b13e-44550d15e793/serve-healthcheck-canary/0.log" Apr 20 19:28:54.359500 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:54.359463 2572 generic.go:358] "Generic (PLEG): container finished" podID="e7a85080-ad4d-4e33-b890-2483a1f5c762" containerID="5588799756f23454cf6ba0a2ac6617468198747a824277fa75e33fb8cc234c77" exitCode=0 Apr 20 19:28:54.359900 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:54.359524 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx" event={"ID":"e7a85080-ad4d-4e33-b890-2483a1f5c762","Type":"ContainerDied","Data":"5588799756f23454cf6ba0a2ac6617468198747a824277fa75e33fb8cc234c77"} Apr 20 19:28:54.359900 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:54.359846 2572 scope.go:117] "RemoveContainer" 
containerID="5588799756f23454cf6ba0a2ac6617468198747a824277fa75e33fb8cc234c77" Apr 20 19:28:55.363887 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:28:55.363857 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b74jx" event={"ID":"e7a85080-ad4d-4e33-b890-2483a1f5c762","Type":"ContainerStarted","Data":"64873d47e412fe47c480bae8474c4753223d4e6002b63a8dc37ceba731c1075e"} Apr 20 19:29:03.270840 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.270798 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-ff7c97478-zk79v" podUID="0b696b86-2f2e-4d02-b224-efa87b93bc24" containerName="console" containerID="cri-o://ce6a16ba5c07d7774711f64caaf6cb517480f2c0c75d0b6680e74e5f01327e3d" gracePeriod=15 Apr 20 19:29:03.396900 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.396872 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ff7c97478-zk79v_0b696b86-2f2e-4d02-b224-efa87b93bc24/console/0.log" Apr 20 19:29:03.397098 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.396920 2572 generic.go:358] "Generic (PLEG): container finished" podID="0b696b86-2f2e-4d02-b224-efa87b93bc24" containerID="ce6a16ba5c07d7774711f64caaf6cb517480f2c0c75d0b6680e74e5f01327e3d" exitCode=2 Apr 20 19:29:03.397098 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.396997 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ff7c97478-zk79v" event={"ID":"0b696b86-2f2e-4d02-b224-efa87b93bc24","Type":"ContainerDied","Data":"ce6a16ba5c07d7774711f64caaf6cb517480f2c0c75d0b6680e74e5f01327e3d"} Apr 20 19:29:03.539459 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.539419 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ff7c97478-zk79v_0b696b86-2f2e-4d02-b224-efa87b93bc24/console/0.log" Apr 20 19:29:03.539603 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.539517 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:29:03.709436 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.709402 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z489v\" (UniqueName: \"kubernetes.io/projected/0b696b86-2f2e-4d02-b224-efa87b93bc24-kube-api-access-z489v\") pod \"0b696b86-2f2e-4d02-b224-efa87b93bc24\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " Apr 20 19:29:03.709630 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.709465 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-oauth-config\") pod \"0b696b86-2f2e-4d02-b224-efa87b93bc24\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " Apr 20 19:29:03.709630 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.709515 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-service-ca\") pod \"0b696b86-2f2e-4d02-b224-efa87b93bc24\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " Apr 20 19:29:03.709630 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.709537 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-serving-cert\") pod \"0b696b86-2f2e-4d02-b224-efa87b93bc24\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " Apr 20 19:29:03.709630 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.709564 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-config\") pod \"0b696b86-2f2e-4d02-b224-efa87b93bc24\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " Apr 20 19:29:03.709839 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.709637 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-oauth-serving-cert\") pod \"0b696b86-2f2e-4d02-b224-efa87b93bc24\" (UID: \"0b696b86-2f2e-4d02-b224-efa87b93bc24\") " Apr 20 19:29:03.710473 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.709951 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b696b86-2f2e-4d02-b224-efa87b93bc24" (UID: "0b696b86-2f2e-4d02-b224-efa87b93bc24"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:29:03.710473 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.710000 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-config" (OuterVolumeSpecName: "console-config") pod "0b696b86-2f2e-4d02-b224-efa87b93bc24" (UID: "0b696b86-2f2e-4d02-b224-efa87b93bc24"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:29:03.710473 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.710069 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0b696b86-2f2e-4d02-b224-efa87b93bc24" (UID: "0b696b86-2f2e-4d02-b224-efa87b93bc24"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:29:03.711841 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.711816 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0b696b86-2f2e-4d02-b224-efa87b93bc24" (UID: "0b696b86-2f2e-4d02-b224-efa87b93bc24"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:03.711841 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.711829 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b696b86-2f2e-4d02-b224-efa87b93bc24-kube-api-access-z489v" (OuterVolumeSpecName: "kube-api-access-z489v") pod "0b696b86-2f2e-4d02-b224-efa87b93bc24" (UID: "0b696b86-2f2e-4d02-b224-efa87b93bc24"). InnerVolumeSpecName "kube-api-access-z489v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:29:03.712003 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.711905 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0b696b86-2f2e-4d02-b224-efa87b93bc24" (UID: "0b696b86-2f2e-4d02-b224-efa87b93bc24"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:03.810794 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.810766 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-oauth-serving-cert\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:03.810794 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.810795 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z489v\" (UniqueName: \"kubernetes.io/projected/0b696b86-2f2e-4d02-b224-efa87b93bc24-kube-api-access-z489v\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:03.810972 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.810805 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-oauth-config\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:03.810972 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.810814 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-service-ca\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:03.810972 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.810822 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-serving-cert\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:03.810972 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:03.810831 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b696b86-2f2e-4d02-b224-efa87b93bc24-console-config\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:04.401621 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:04.401596 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ff7c97478-zk79v_0b696b86-2f2e-4d02-b224-efa87b93bc24/console/0.log" Apr 20 19:29:04.402015 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:04.401709 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-ff7c97478-zk79v" Apr 20 19:29:04.402015 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:04.401710 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ff7c97478-zk79v" event={"ID":"0b696b86-2f2e-4d02-b224-efa87b93bc24","Type":"ContainerDied","Data":"798483943bbb16abe098a2de560af99124d706d64058acad264e272c2b63f93d"} Apr 20 19:29:04.402015 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:04.401754 2572 scope.go:117] "RemoveContainer" containerID="ce6a16ba5c07d7774711f64caaf6cb517480f2c0c75d0b6680e74e5f01327e3d" Apr 20 19:29:04.422637 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:04.422612 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ff7c97478-zk79v"] Apr 20 19:29:04.425963 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:04.425938 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-ff7c97478-zk79v"] Apr 20 19:29:04.602270 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:04.602236 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b696b86-2f2e-4d02-b224-efa87b93bc24" path="/var/lib/kubelet/pods/0b696b86-2f2e-4d02-b224-efa87b93bc24/volumes" Apr 20 19:29:19.450126 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:19.450090 2572 generic.go:358] "Generic (PLEG): container finished" podID="e52f4554-13e1-451b-851d-003e1e091adc" containerID="21213bde81c11727685bc4fffc6acb6eee908d52a8841d7d9105f5e7434e2c22" exitCode=0 Apr 20 19:29:19.450553 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:19.450169 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c" event={"ID":"e52f4554-13e1-451b-851d-003e1e091adc","Type":"ContainerDied","Data":"21213bde81c11727685bc4fffc6acb6eee908d52a8841d7d9105f5e7434e2c22"} Apr 20 19:29:19.450553 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:19.450509 2572 scope.go:117] "RemoveContainer" containerID="21213bde81c11727685bc4fffc6acb6eee908d52a8841d7d9105f5e7434e2c22" Apr 20 19:29:20.454916 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:20.454883 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qf28c" event={"ID":"e52f4554-13e1-451b-851d-003e1e091adc","Type":"ContainerStarted","Data":"a1c1fd6181084b5fa74d6704f3964dcbd8caf4925662c10ea977c1d05f25aab2"} Apr 20 19:29:28.267301 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:28.267251 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:28.286918 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:28.286885 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:28.496558 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:28.496531 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:37.306883 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:37.306840 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:29:37.309154 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:37.309133 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513dd790-7dbf-46da-821a-3493b9941466-metrics-certs\") pod \"network-metrics-daemon-7cd7d\" (UID: \"513dd790-7dbf-46da-821a-3493b9941466\") " pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:29:37.605044 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:37.605011 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v8fpn\"" Apr 20 19:29:37.612725 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:37.612695 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cd7d" Apr 20 19:29:37.943505 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:37.943480 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7cd7d"] Apr 20 19:29:37.945927 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:29:37.945894 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod513dd790_7dbf_46da_821a_3493b9941466.slice/crio-1e3d4e5b349c1b09f1e97a7c633338611dced41d61f67510ae3a69153e96921a WatchSource:0}: Error finding container 1e3d4e5b349c1b09f1e97a7c633338611dced41d61f67510ae3a69153e96921a: Status 404 returned error can't find the container with id 1e3d4e5b349c1b09f1e97a7c633338611dced41d61f67510ae3a69153e96921a Apr 20 19:29:38.510934 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:38.510896 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7cd7d" event={"ID":"513dd790-7dbf-46da-821a-3493b9941466","Type":"ContainerStarted","Data":"1e3d4e5b349c1b09f1e97a7c633338611dced41d61f67510ae3a69153e96921a"} Apr 20 19:29:39.518637 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:39.518600 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7cd7d" event={"ID":"513dd790-7dbf-46da-821a-3493b9941466","Type":"ContainerStarted","Data":"f4f8460a1f555ffd9be0bbe80311820cc4ed5a460363e82ccfee3635a0f622b5"} Apr 20 19:29:39.518637 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:39.518636 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7cd7d" event={"ID":"513dd790-7dbf-46da-821a-3493b9941466","Type":"ContainerStarted","Data":"e2bc2f0301c7160b29eac67bd2887c7fa85758cdf0a6a2f66ab81bb15ca750ea"} Apr 20 19:29:39.535874 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:39.535769 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7cd7d" podStartSLOduration=252.548491685 podStartE2EDuration="4m13.535750078s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="2026-04-20 19:29:37.948000359 +0000 UTC m=+251.938487872" lastFinishedPulling="2026-04-20 19:29:38.935258748 +0000 UTC m=+252.925746265" observedRunningTime="2026-04-20 19:29:39.535725692 +0000 UTC m=+253.526213225" watchObservedRunningTime="2026-04-20 19:29:39.535750078 +0000 UTC m=+253.526237614" Apr 20 19:29:46.332191 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.332105 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:29:46.332584 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.332530 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="prometheus" containerID="cri-o://ec97f8ede97bc37bc1c1a51b757fa87df42cfa7cf0a5f9bce3e2390d6002e5a2" gracePeriod=600 Apr 20 19:29:46.332584 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.332549 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="kube-rbac-proxy-thanos" containerID="cri-o://01f66f38280a04bcafb66f0f77aa251b84f94200b3e9a3acf953a7873eae4fe9" gracePeriod=600 Apr 20 19:29:46.332708 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.332587 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="kube-rbac-proxy" containerID="cri-o://1575891abb2e069e28ed5059632eddee5377ce8ebcdeb0938f7c107d81cbc188" gracePeriod=600 Apr 20 19:29:46.332708 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.332600 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="kube-rbac-proxy-web" containerID="cri-o://59e1b9fed2aa1f8007aca0082b44d573cce9d71017bcff89c1c90b97ba5ffddc" gracePeriod=600 Apr 20 19:29:46.332708 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.332619 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="config-reloader" containerID="cri-o://ed6ca77a45d9f9820a5bf5448f180535e55a74631f1fe00ef4c9aadfdbb8116a" gracePeriod=600 Apr 20 19:29:46.332829 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.332737 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="thanos-sidecar" containerID="cri-o://9be3aae681c3480fac29970461ba5dcf0c6713fd0a9de7405a3f06b43f0593c1" gracePeriod=600 Apr 20 19:29:46.544048 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.544015 2572 generic.go:358] "Generic (PLEG): container finished" podID="269f1398-7daa-4518-b6a5-65f56c2971de" containerID="01f66f38280a04bcafb66f0f77aa251b84f94200b3e9a3acf953a7873eae4fe9" exitCode=0 Apr 20 19:29:46.544048 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.544041 2572 generic.go:358] "Generic (PLEG): container finished" podID="269f1398-7daa-4518-b6a5-65f56c2971de" containerID="1575891abb2e069e28ed5059632eddee5377ce8ebcdeb0938f7c107d81cbc188" exitCode=0 Apr 20 19:29:46.544048 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.544047 2572 generic.go:358] "Generic (PLEG): container finished" podID="269f1398-7daa-4518-b6a5-65f56c2971de" containerID="59e1b9fed2aa1f8007aca0082b44d573cce9d71017bcff89c1c90b97ba5ffddc" exitCode=0 Apr 20 19:29:46.544048 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.544053 2572 generic.go:358] "Generic (PLEG): container finished" podID="269f1398-7daa-4518-b6a5-65f56c2971de" containerID="9be3aae681c3480fac29970461ba5dcf0c6713fd0a9de7405a3f06b43f0593c1" exitCode=0 Apr 20 19:29:46.544048 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.544058 2572 generic.go:358] "Generic (PLEG): container finished" podID="269f1398-7daa-4518-b6a5-65f56c2971de" containerID="ed6ca77a45d9f9820a5bf5448f180535e55a74631f1fe00ef4c9aadfdbb8116a" exitCode=0 Apr 20 19:29:46.544351 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.544063 
2572 generic.go:358] "Generic (PLEG): container finished" podID="269f1398-7daa-4518-b6a5-65f56c2971de" containerID="ec97f8ede97bc37bc1c1a51b757fa87df42cfa7cf0a5f9bce3e2390d6002e5a2" exitCode=0 Apr 20 19:29:46.544351 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.544088 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerDied","Data":"01f66f38280a04bcafb66f0f77aa251b84f94200b3e9a3acf953a7873eae4fe9"} Apr 20 19:29:46.544351 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.544127 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerDied","Data":"1575891abb2e069e28ed5059632eddee5377ce8ebcdeb0938f7c107d81cbc188"} Apr 20 19:29:46.544351 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.544141 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerDied","Data":"59e1b9fed2aa1f8007aca0082b44d573cce9d71017bcff89c1c90b97ba5ffddc"} Apr 20 19:29:46.544351 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.544154 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerDied","Data":"9be3aae681c3480fac29970461ba5dcf0c6713fd0a9de7405a3f06b43f0593c1"} Apr 20 19:29:46.544351 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.544169 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerDied","Data":"ed6ca77a45d9f9820a5bf5448f180535e55a74631f1fe00ef4c9aadfdbb8116a"} Apr 20 19:29:46.544351 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.544180 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerDied","Data":"ec97f8ede97bc37bc1c1a51b757fa87df42cfa7cf0a5f9bce3e2390d6002e5a2"} Apr 20 19:29:46.572417 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.572394 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:46.690885 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.690787 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.690885 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.690833 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-web-config\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.690885 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.690862 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-thanos-prometheus-http-client-file\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.690896 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-trusted-ca-bundle\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.690926 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-k8s-db\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.690964 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-metrics-client-certs\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.690990 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/269f1398-7daa-4518-b6a5-65f56c2971de-config-out\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691034 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-metrics-client-ca\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691086 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-kubelet-serving-ca-bundle\") pod 
\"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691120 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/269f1398-7daa-4518-b6a5-65f56c2971de-tls-assets\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691145 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-k8s-rulefiles-0\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691188 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-serving-certs-ca-bundle\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691229 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-kube-rbac-proxy\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691288 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-config\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691320 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs2bv\" (UniqueName: \"kubernetes.io/projected/269f1398-7daa-4518-b6a5-65f56c2971de-kube-api-access-cs2bv\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691341 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-tls\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691364 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-grpc-tls\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: \"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691389 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"269f1398-7daa-4518-b6a5-65f56c2971de\" (UID: 
\"269f1398-7daa-4518-b6a5-65f56c2971de\") " Apr 20 19:29:46.691728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691404 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:29:46.691728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.691724 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.692162 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.692059 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:29:46.692565 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.692535 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:29:46.692924 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.692900 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:29:46.693294 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.693265 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:29:46.694042 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.693804 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:46.694146 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.694053 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/269f1398-7daa-4518-b6a5-65f56c2971de-config-out" (OuterVolumeSpecName: "config-out") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:29:46.694146 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.694063 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:46.694252 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.694228 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:46.694451 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.694409 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:46.694607 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.694582 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:29:46.695902 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.695869 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:46.696088 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.696054 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-config" (OuterVolumeSpecName: "config") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:46.696088 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.696066 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269f1398-7daa-4518-b6a5-65f56c2971de-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:29:46.696288 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.696272 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:46.696355 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.696319 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269f1398-7daa-4518-b6a5-65f56c2971de-kube-api-access-cs2bv" (OuterVolumeSpecName: "kube-api-access-cs2bv") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "kube-api-access-cs2bv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:29:46.697066 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.697039 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:46.705139 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.705111 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-web-config" (OuterVolumeSpecName: "web-config") pod "269f1398-7daa-4518-b6a5-65f56c2971de" (UID: "269f1398-7daa-4518-b6a5-65f56c2971de"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:46.792345 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792314 2572 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-metrics-client-certs\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792345 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792342 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/269f1398-7daa-4518-b6a5-65f56c2971de-config-out\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792345 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792353 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-metrics-client-ca\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792364 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792373 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/269f1398-7daa-4518-b6a5-65f56c2971de-tls-assets\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792381 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792390 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/269f1398-7daa-4518-b6a5-65f56c2971de-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792400 2572 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-kube-rbac-proxy\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792409 2572 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-config\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792417 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cs2bv\" (UniqueName: \"kubernetes.io/projected/269f1398-7daa-4518-b6a5-65f56c2971de-kube-api-access-cs2bv\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792427 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792435 2572 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-grpc-tls\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792469 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792478 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792487 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-web-config\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792496 2572 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/269f1398-7daa-4518-b6a5-65f56c2971de-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:46.792563 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:46.792505 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/269f1398-7daa-4518-b6a5-65f56c2971de-prometheus-k8s-db\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:29:47.550672 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.550636 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"269f1398-7daa-4518-b6a5-65f56c2971de","Type":"ContainerDied","Data":"57ed1b37c2bd15efb0aee8ae38cfcd1284f105e371052383c788c68320d6557c"} Apr 20 19:29:47.551094 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.550692 2572 scope.go:117] "RemoveContainer" containerID="01f66f38280a04bcafb66f0f77aa251b84f94200b3e9a3acf953a7873eae4fe9" Apr 20 19:29:47.551094 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.550736 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.558673 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.558516 2572 scope.go:117] "RemoveContainer" containerID="1575891abb2e069e28ed5059632eddee5377ce8ebcdeb0938f7c107d81cbc188" Apr 20 19:29:47.566347 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.566328 2572 scope.go:117] "RemoveContainer" containerID="59e1b9fed2aa1f8007aca0082b44d573cce9d71017bcff89c1c90b97ba5ffddc" Apr 20 19:29:47.573038 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.573014 2572 scope.go:117] "RemoveContainer" containerID="9be3aae681c3480fac29970461ba5dcf0c6713fd0a9de7405a3f06b43f0593c1" Apr 20 19:29:47.574715 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.574694 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:29:47.577848 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.577821 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:29:47.580714 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.580691 2572 scope.go:117] "RemoveContainer" containerID="ed6ca77a45d9f9820a5bf5448f180535e55a74631f1fe00ef4c9aadfdbb8116a" Apr 20 19:29:47.587099 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.587082 2572 scope.go:117] "RemoveContainer" containerID="ec97f8ede97bc37bc1c1a51b757fa87df42cfa7cf0a5f9bce3e2390d6002e5a2" Apr 20 19:29:47.594226 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.594207 2572 scope.go:117] "RemoveContainer" containerID="f62e7a89f63e49b55041e818a991a2298b6ae22a0521dfd5f2b935ffb308e5bf" Apr 20 19:29:47.601489 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601399 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:29:47.601824 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601804 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b696b86-2f2e-4d02-b224-efa87b93bc24" containerName="console" Apr 20 19:29:47.601898 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601827 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b696b86-2f2e-4d02-b224-efa87b93bc24" containerName="console" Apr 20 19:29:47.601898 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601839 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="kube-rbac-proxy-web" Apr 20 19:29:47.601898 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601847 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="kube-rbac-proxy-web" Apr 20 19:29:47.601898 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601868 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="prometheus" Apr 20 19:29:47.601898 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601877 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="prometheus" Apr 20 19:29:47.601898 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601888 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="init-config-reloader" Apr 20 19:29:47.601898 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601894 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" 
containerName="init-config-reloader" Apr 20 19:29:47.601898 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601900 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="config-reloader" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601906 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="config-reloader" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601913 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="kube-rbac-proxy" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601918 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="kube-rbac-proxy" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601926 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="kube-rbac-proxy-thanos" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601930 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="kube-rbac-proxy-thanos" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601937 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="thanos-sidecar" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601941 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="thanos-sidecar" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.601992 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="config-reloader" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.602000 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b696b86-2f2e-4d02-b224-efa87b93bc24" containerName="console" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.602008 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="prometheus" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.602016 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="kube-rbac-proxy-web" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.602023 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="kube-rbac-proxy" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.602032 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="kube-rbac-proxy-thanos" Apr 20 19:29:47.602267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.602039 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" containerName="thanos-sidecar" Apr 20 19:29:47.607023 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.607004 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.609729 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.609705 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 19:29:47.609729 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.609726 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tncz6\"" Apr 20 19:29:47.609908 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.609755 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 19:29:47.609908 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.609765 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-51sjjv57t37t6\"" Apr 20 19:29:47.609908 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.609705 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 19:29:47.609908 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.609767 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 19:29:47.610077 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.609964 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 19:29:47.610077 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.609963 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 19:29:47.610264 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.610251 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 19:29:47.610304 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.610281 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 19:29:47.610493 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.610478 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 19:29:47.610610 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.610593 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 19:29:47.610675 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.610599 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 19:29:47.612535 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.612515 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 19:29:47.614938 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.614919 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 19:29:47.618074 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.618056 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:29:47.700415 
ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700373 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.700617 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700477 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.700617 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700501 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.700617 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700520 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.700617 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700541 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.700617 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700565 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvsgv\" (UniqueName: \"kubernetes.io/projected/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-kube-api-access-fvsgv\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.700617 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700592 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.700617 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.700865 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700684 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-config-out\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.700865 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700723 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.700865 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700756 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-web-config\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.700865 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.700865 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700827 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.701026 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700864 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.701026 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700902 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.701026 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700942 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-config\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.701026 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.701026 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.700983 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.802324 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.802288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-config-out\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.802324 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.802326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.802584 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.802344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-web-config\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.802584 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.802371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.802584 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.802398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.802584 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.802426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.802584 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.802480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.803022 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.802991 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-config\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.803115 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.803047 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.803115 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.803088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.803227 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.803118 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.803227 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.803192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.803227 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.803215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.803390 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.803220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.803390 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.803282 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.803390 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.803316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" 
(UniqueName: \"kubernetes.io/empty-dir/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.803390 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.803345 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvsgv\" (UniqueName: \"kubernetes.io/projected/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-kube-api-access-fvsgv\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.803390 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.803378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.803678 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.803409 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.804080 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.804053 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.804180 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.804136 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.804753 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.804725 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.805555 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.805531 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.805684 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.805623 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 
19:29:47.806185 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.806159 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.806626 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.806601 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.807024 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.806984 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.807098 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.807030 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-web-config\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.807688 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.807516 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-config\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.807774 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.807743 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-config-out\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.807894 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.807876 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.808104 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.808084 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.808487 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.808472 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.808662 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.808644 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.808883 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.808867 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.813513 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.813492 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvsgv\" (UniqueName: \"kubernetes.io/projected/6e5df1e4-62f1-4da8-ab77-58a1cddc3055-kube-api-access-fvsgv\") pod \"prometheus-k8s-0\" (UID: \"6e5df1e4-62f1-4da8-ab77-58a1cddc3055\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:47.917896 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:47.917799 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:29:48.048326 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:48.048293 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:29:48.051569 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:29:48.051541 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e5df1e4_62f1_4da8_ab77_58a1cddc3055.slice/crio-78849c75907a9388c24fb691d89ec2d06c54fd2de1d2674b212bc343813c75ab WatchSource:0}: Error finding container 78849c75907a9388c24fb691d89ec2d06c54fd2de1d2674b212bc343813c75ab: Status 404 returned error can't find the container with id 78849c75907a9388c24fb691d89ec2d06c54fd2de1d2674b212bc343813c75ab Apr 20 19:29:48.555844 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:48.555803 2572 generic.go:358] "Generic (PLEG): container finished" podID="6e5df1e4-62f1-4da8-ab77-58a1cddc3055" containerID="0c356bf065cc4b1c5385a302b2d299b903db1b889152dfa24414dd6c21f68a9b" exitCode=0 Apr 20 19:29:48.556270 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:48.555901 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e5df1e4-62f1-4da8-ab77-58a1cddc3055","Type":"ContainerDied","Data":"0c356bf065cc4b1c5385a302b2d299b903db1b889152dfa24414dd6c21f68a9b"} Apr 20 19:29:48.556270 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:48.555947 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e5df1e4-62f1-4da8-ab77-58a1cddc3055","Type":"ContainerStarted","Data":"78849c75907a9388c24fb691d89ec2d06c54fd2de1d2674b212bc343813c75ab"} Apr 20 19:29:48.604339 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:48.604311 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="269f1398-7daa-4518-b6a5-65f56c2971de" path="/var/lib/kubelet/pods/269f1398-7daa-4518-b6a5-65f56c2971de/volumes" Apr 20 19:29:49.565143 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:49.565108 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e5df1e4-62f1-4da8-ab77-58a1cddc3055","Type":"ContainerStarted","Data":"b5dea33e8b66d9d9796f999ec88ab0ea293a02a287a53d965ed1e71c004d3f3b"} Apr 20 19:29:49.565143 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:49.565146 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e5df1e4-62f1-4da8-ab77-58a1cddc3055","Type":"ContainerStarted","Data":"87a008216a1dfe65e3df8d03ca609ddc2f3925776b8c8df02fed25983f16979b"} Apr 20 19:29:49.565672 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:49.565159 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e5df1e4-62f1-4da8-ab77-58a1cddc3055","Type":"ContainerStarted","Data":"6861a53c4af7df430aac853174a2277e41cb192ceae304380d9dfeac00625b97"} Apr 20 19:29:49.565672 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:49.565170 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e5df1e4-62f1-4da8-ab77-58a1cddc3055","Type":"ContainerStarted","Data":"eabb711cb4d224f03c82eb0b895f922538ec956ed7f0c595d26df6e817c73662"} Apr 20 19:29:49.565672 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:49.565180 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e5df1e4-62f1-4da8-ab77-58a1cddc3055","Type":"ContainerStarted","Data":"ffa802dcf438109622b3289ca246c7c6867dbb735004c65e06dc32fb1ac14048"} Apr 20 19:29:49.565672 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:49.565191 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e5df1e4-62f1-4da8-ab77-58a1cddc3055","Type":"ContainerStarted","Data":"8b518e1fe9f1fd26e10d7431204a41806168f1e6eef8273bc74eedf48e8bc52d"} Apr 20 19:29:49.594405 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:49.594342 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.594322305 podStartE2EDuration="2.594322305s" podCreationTimestamp="2026-04-20 19:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:29:49.592429836 +0000 UTC m=+263.582917408" watchObservedRunningTime="2026-04-20 19:29:49.594322305 +0000 UTC m=+263.584809840" Apr 20 19:29:52.918659 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:29:52.918615 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:30:26.479369 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:30:26.479332 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/1.log" Apr 20 19:30:26.480027 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:30:26.480010 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/1.log" Apr 20 19:30:26.491919 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:30:26.491898 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 19:30:47.918358 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:30:47.918321 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:30:47.934206 
ip-10-0-131-162 kubenswrapper[2572]: I0420 19:30:47.934180 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:30:48.754294 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:30:48.754268 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:31:53.801872 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.801835 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg"] Apr 20 19:31:53.805394 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.805371 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:31:53.808263 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.808230 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 19:31:53.808421 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.808273 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 19:31:53.808421 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.808386 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 19:31:53.808421 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.808408 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 19:31:53.808676 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.808487 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-cx2tl\"" Apr 20 19:31:53.826251 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.826224 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg"] Apr 20 19:31:53.878744 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.878705 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzvp5\" (UniqueName: \"kubernetes.io/projected/3507a6b2-e19b-491a-bd27-a1bb5136cda2-kube-api-access-wzvp5\") pod \"opendatahub-operator-controller-manager-9f747d685-2n2zg\" (UID: \"3507a6b2-e19b-491a-bd27-a1bb5136cda2\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:31:53.878920 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.878812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3507a6b2-e19b-491a-bd27-a1bb5136cda2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-2n2zg\" (UID: \"3507a6b2-e19b-491a-bd27-a1bb5136cda2\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:31:53.878920 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.878887 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3507a6b2-e19b-491a-bd27-a1bb5136cda2-webhook-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-2n2zg\" (UID: \"3507a6b2-e19b-491a-bd27-a1bb5136cda2\") " 
pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:31:53.979483 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.979427 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3507a6b2-e19b-491a-bd27-a1bb5136cda2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-2n2zg\" (UID: \"3507a6b2-e19b-491a-bd27-a1bb5136cda2\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:31:53.979660 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.979516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3507a6b2-e19b-491a-bd27-a1bb5136cda2-webhook-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-2n2zg\" (UID: \"3507a6b2-e19b-491a-bd27-a1bb5136cda2\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:31:53.979660 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.979553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzvp5\" (UniqueName: \"kubernetes.io/projected/3507a6b2-e19b-491a-bd27-a1bb5136cda2-kube-api-access-wzvp5\") pod \"opendatahub-operator-controller-manager-9f747d685-2n2zg\" (UID: \"3507a6b2-e19b-491a-bd27-a1bb5136cda2\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:31:53.981969 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.981944 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3507a6b2-e19b-491a-bd27-a1bb5136cda2-webhook-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-2n2zg\" (UID: \"3507a6b2-e19b-491a-bd27-a1bb5136cda2\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:31:53.982091 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.982003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3507a6b2-e19b-491a-bd27-a1bb5136cda2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-2n2zg\" (UID: \"3507a6b2-e19b-491a-bd27-a1bb5136cda2\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:31:53.990320 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:53.990291 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzvp5\" (UniqueName: \"kubernetes.io/projected/3507a6b2-e19b-491a-bd27-a1bb5136cda2-kube-api-access-wzvp5\") pod \"opendatahub-operator-controller-manager-9f747d685-2n2zg\" (UID: \"3507a6b2-e19b-491a-bd27-a1bb5136cda2\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:31:54.116840 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:54.116806 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:31:54.255094 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:54.255066 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg"] Apr 20 19:31:54.257841 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:31:54.257814 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3507a6b2_e19b_491a_bd27_a1bb5136cda2.slice/crio-0e372b7b29d4bbf7076c00452c8f9eb096a996fcd302659133fca6937d1c6289 WatchSource:0}: Error finding container 0e372b7b29d4bbf7076c00452c8f9eb096a996fcd302659133fca6937d1c6289: Status 404 returned error can't find the container with id 0e372b7b29d4bbf7076c00452c8f9eb096a996fcd302659133fca6937d1c6289 Apr 20 19:31:54.259546 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:54.259527 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:31:54.945555 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:54.945519 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" event={"ID":"3507a6b2-e19b-491a-bd27-a1bb5136cda2","Type":"ContainerStarted","Data":"0e372b7b29d4bbf7076c00452c8f9eb096a996fcd302659133fca6937d1c6289"} Apr 20 19:31:56.954901 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:56.954868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" event={"ID":"3507a6b2-e19b-491a-bd27-a1bb5136cda2","Type":"ContainerStarted","Data":"828bd530a3ecbd8173cc09c4288de55e7841bcc848a8f0aabe656210fe862579"} Apr 20 19:31:56.955353 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:56.954994 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:31:56.975919 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:56.975876 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" podStartSLOduration=1.380948565 podStartE2EDuration="3.975862331s" podCreationTimestamp="2026-04-20 19:31:53 +0000 UTC" firstStartedPulling="2026-04-20 19:31:54.259646517 +0000 UTC m=+388.250134046" lastFinishedPulling="2026-04-20 19:31:56.854560298 +0000 UTC m=+390.845047812" observedRunningTime="2026-04-20 19:31:56.973986975 +0000 UTC m=+390.964474546" watchObservedRunningTime="2026-04-20 19:31:56.975862331 +0000 UTC m=+390.966349866" Apr 20 19:31:58.864819 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.864777 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7"] Apr 20 19:31:58.868384 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.868358 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:58.872168 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.872142 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:31:58.872168 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.872157 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-8lsbm\"" Apr 20 19:31:58.872353 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.872183 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 19:31:58.872353 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.872162 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 19:31:58.872353 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.872218 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 19:31:58.872353 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.872201 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 19:31:58.880033 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.880009 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7"] Apr 20 19:31:58.926337 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.926298 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrq87\" (UniqueName: \"kubernetes.io/projected/2020f544-06f0-42d2-94b5-697f9b55cd3a-kube-api-access-mrq87\") pod \"lws-controller-manager-6ddf46b867-5jsw7\" (UID: \"2020f544-06f0-42d2-94b5-697f9b55cd3a\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:58.926538 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.926342 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2020f544-06f0-42d2-94b5-697f9b55cd3a-cert\") pod \"lws-controller-manager-6ddf46b867-5jsw7\" (UID: \"2020f544-06f0-42d2-94b5-697f9b55cd3a\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:58.926538 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.926482 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2020f544-06f0-42d2-94b5-697f9b55cd3a-manager-config\") pod \"lws-controller-manager-6ddf46b867-5jsw7\" (UID: \"2020f544-06f0-42d2-94b5-697f9b55cd3a\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:58.926538 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:58.926525 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2020f544-06f0-42d2-94b5-697f9b55cd3a-metrics-cert\") pod \"lws-controller-manager-6ddf46b867-5jsw7\" (UID: \"2020f544-06f0-42d2-94b5-697f9b55cd3a\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:59.027407 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:59.027370 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2020f544-06f0-42d2-94b5-697f9b55cd3a-manager-config\") pod \"lws-controller-manager-6ddf46b867-5jsw7\" (UID: \"2020f544-06f0-42d2-94b5-697f9b55cd3a\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:59.027407 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:59.027415 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2020f544-06f0-42d2-94b5-697f9b55cd3a-metrics-cert\") pod \"lws-controller-manager-6ddf46b867-5jsw7\" (UID: \"2020f544-06f0-42d2-94b5-697f9b55cd3a\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:59.027673 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:59.027479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrq87\" (UniqueName: \"kubernetes.io/projected/2020f544-06f0-42d2-94b5-697f9b55cd3a-kube-api-access-mrq87\") pod \"lws-controller-manager-6ddf46b867-5jsw7\" (UID: \"2020f544-06f0-42d2-94b5-697f9b55cd3a\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:59.027673 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:59.027577 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2020f544-06f0-42d2-94b5-697f9b55cd3a-cert\") pod \"lws-controller-manager-6ddf46b867-5jsw7\" (UID: \"2020f544-06f0-42d2-94b5-697f9b55cd3a\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:59.028028 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:59.028006 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2020f544-06f0-42d2-94b5-697f9b55cd3a-manager-config\") pod \"lws-controller-manager-6ddf46b867-5jsw7\" (UID: \"2020f544-06f0-42d2-94b5-697f9b55cd3a\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:59.029970 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:59.029952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2020f544-06f0-42d2-94b5-697f9b55cd3a-cert\") pod \"lws-controller-manager-6ddf46b867-5jsw7\" (UID: \"2020f544-06f0-42d2-94b5-697f9b55cd3a\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:59.030138 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:59.030118 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2020f544-06f0-42d2-94b5-697f9b55cd3a-metrics-cert\") pod \"lws-controller-manager-6ddf46b867-5jsw7\" (UID: \"2020f544-06f0-42d2-94b5-697f9b55cd3a\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:59.035642 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:59.035614 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrq87\" (UniqueName: \"kubernetes.io/projected/2020f544-06f0-42d2-94b5-697f9b55cd3a-kube-api-access-mrq87\") pod \"lws-controller-manager-6ddf46b867-5jsw7\" (UID: \"2020f544-06f0-42d2-94b5-697f9b55cd3a\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:59.177537 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:59.177428 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:31:59.321017 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:59.320994 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7"] Apr 20 19:31:59.323850 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:31:59.323822 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2020f544_06f0_42d2_94b5_697f9b55cd3a.slice/crio-a6a8cd8553a66b8d6b2c6d9803f8b016fa3d710a95ac5d350d410ef81cb51bc8 WatchSource:0}: Error finding container a6a8cd8553a66b8d6b2c6d9803f8b016fa3d710a95ac5d350d410ef81cb51bc8: Status 404 returned error can't find the container with id a6a8cd8553a66b8d6b2c6d9803f8b016fa3d710a95ac5d350d410ef81cb51bc8 Apr 20 19:31:59.966598 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:31:59.966562 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" event={"ID":"2020f544-06f0-42d2-94b5-697f9b55cd3a","Type":"ContainerStarted","Data":"a6a8cd8553a66b8d6b2c6d9803f8b016fa3d710a95ac5d350d410ef81cb51bc8"} Apr 20 19:32:01.974179 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:01.974145 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" event={"ID":"2020f544-06f0-42d2-94b5-697f9b55cd3a","Type":"ContainerStarted","Data":"1094f0613fe4bef39b21ec303d4e78de12f4db4e6511413a981453189cd967eb"} Apr 20 19:32:01.974588 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:01.974263 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:32:01.995503 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:01.995451 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" podStartSLOduration=1.453964627 podStartE2EDuration="3.995421328s" podCreationTimestamp="2026-04-20 19:31:58 +0000 UTC" firstStartedPulling="2026-04-20 19:31:59.325707114 +0000 UTC m=+393.316194627" lastFinishedPulling="2026-04-20 19:32:01.867163812 +0000 UTC m=+395.857651328" observedRunningTime="2026-04-20 19:32:01.994178537 +0000 UTC m=+395.984666076" watchObservedRunningTime="2026-04-20 19:32:01.995421328 +0000 UTC m=+395.985908862" Apr 20 19:32:07.959898 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:07.959869 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-2n2zg" Apr 20 19:32:12.979156 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:12.979127 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-5jsw7" Apr 20 19:32:57.307820 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.307723 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7"] Apr 20 19:32:57.311432 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.311403 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.314152 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.314128 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 19:32:57.314294 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.314142 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-lp869\"" Apr 20 19:32:57.314363 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.314142 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 19:32:57.314423 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.314168 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 19:32:57.324316 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.324292 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7"] Apr 20 19:32:57.429604 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.429568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.429604 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.429616 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.429835 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.429652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.429835 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.429677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.429835 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.429709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mndwc\" (UniqueName: \"kubernetes.io/projected/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-kube-api-access-mndwc\") pod 
\"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.429835 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.429801 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.429835 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.429831 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.430014 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.429854 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.430014 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.429870 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.530555 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.530523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.530711 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.530560 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.530711 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.530591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mndwc\" (UniqueName: \"kubernetes.io/projected/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-kube-api-access-mndwc\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: 
\"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.530711 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.530643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.530711 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.530661 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.530711 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.530689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.530934 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.530717 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.530934 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.530761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.530934 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.530792 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.531087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.530970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.531209 
ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.531185 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.531390 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.531366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.531495 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.531474 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.531761 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.531739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.533294 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.533269 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.533486 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.533469 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.538775 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.538749 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.539732 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.539700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mndwc\" (UniqueName: 
\"kubernetes.io/projected/5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8-kube-api-access-mndwc\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f9tbr7\" (UID: \"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.624757 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.624725 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:32:57.751127 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:57.751099 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7"] Apr 20 19:32:57.754021 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:32:57.753993 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ad2a447_0e3b_4eb4_b9e6_3ac592de3ea8.slice/crio-e1c63e6d1b2bb0d74271fadd5a1e734901daf37920bc79f6e5ebdcb4b522fca9 WatchSource:0}: Error finding container e1c63e6d1b2bb0d74271fadd5a1e734901daf37920bc79f6e5ebdcb4b522fca9: Status 404 returned error can't find the container with id e1c63e6d1b2bb0d74271fadd5a1e734901daf37920bc79f6e5ebdcb4b522fca9 Apr 20 19:32:58.161869 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:32:58.161832 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" event={"ID":"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8","Type":"ContainerStarted","Data":"e1c63e6d1b2bb0d74271fadd5a1e734901daf37920bc79f6e5ebdcb4b522fca9"} Apr 20 19:33:00.518391 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:00.518347 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 19:33:00.518773 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:00.518473 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 19:33:00.518773 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:00.518523 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 19:33:01.173875 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:01.173829 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" event={"ID":"5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8","Type":"ContainerStarted","Data":"8b7b41c49b6c5c39bb191839418b5ec3a734f23ebac242ad331e6532e44cf58f"} Apr 20 19:33:01.195491 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:01.195420 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" podStartSLOduration=1.433204989 podStartE2EDuration="4.195406011s" podCreationTimestamp="2026-04-20 19:32:57 +0000 UTC" firstStartedPulling="2026-04-20 19:32:57.755887511 +0000 UTC m=+451.746375024" lastFinishedPulling="2026-04-20 19:33:00.51808853 +0000 UTC m=+454.508576046" observedRunningTime="2026-04-20 19:33:01.193406127 +0000 UTC m=+455.183893661" watchObservedRunningTime="2026-04-20 19:33:01.195406011 +0000 UTC 
m=+455.185893546" Apr 20 19:33:01.625766 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:01.625728 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:33:01.630376 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:01.630353 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:33:02.177559 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:02.177531 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:33:02.178477 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:02.178456 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f9tbr7" Apr 20 19:33:11.391159 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.391117 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kwrs9"] Apr 20 19:33:11.394414 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.394397 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" Apr 20 19:33:11.397145 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.397120 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-xtksc\"" Apr 20 19:33:11.397305 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.397130 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 19:33:11.398262 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.398245 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 19:33:11.402879 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.402858 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kwrs9"] Apr 20 19:33:11.463863 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.463824 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcg7c\" (UniqueName: \"kubernetes.io/projected/63be7e17-6758-431f-8c27-821172dbbff1-kube-api-access-mcg7c\") pod \"kuadrant-operator-catalog-kwrs9\" (UID: \"63be7e17-6758-431f-8c27-821172dbbff1\") " pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" Apr 20 19:33:11.564696 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.564656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcg7c\" (UniqueName: \"kubernetes.io/projected/63be7e17-6758-431f-8c27-821172dbbff1-kube-api-access-mcg7c\") pod \"kuadrant-operator-catalog-kwrs9\" (UID: \"63be7e17-6758-431f-8c27-821172dbbff1\") " pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" Apr 20 19:33:11.572805 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.572782 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcg7c\" (UniqueName: \"kubernetes.io/projected/63be7e17-6758-431f-8c27-821172dbbff1-kube-api-access-mcg7c\") pod \"kuadrant-operator-catalog-kwrs9\" (UID: \"63be7e17-6758-431f-8c27-821172dbbff1\") " pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" Apr 
20 19:33:11.704901 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.704820 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" Apr 20 19:33:11.763977 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.763906 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kwrs9"] Apr 20 19:33:11.827471 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.827431 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kwrs9"] Apr 20 19:33:11.829627 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:33:11.829596 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63be7e17_6758_431f_8c27_821172dbbff1.slice/crio-f38460b314c80fde6dfbeda1892939a90bba57198466e25d74486cc21e4fa02d WatchSource:0}: Error finding container f38460b314c80fde6dfbeda1892939a90bba57198466e25d74486cc21e4fa02d: Status 404 returned error can't find the container with id f38460b314c80fde6dfbeda1892939a90bba57198466e25d74486cc21e4fa02d Apr 20 19:33:11.969478 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.969387 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pv7v7"] Apr 20 19:33:11.973939 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.973924 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-pv7v7" Apr 20 19:33:11.979754 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:11.979722 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pv7v7"] Apr 20 19:33:12.069603 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:12.069568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65mpd\" (UniqueName: \"kubernetes.io/projected/9b137590-67f2-410c-a41d-ef930a9357e1-kube-api-access-65mpd\") pod \"kuadrant-operator-catalog-pv7v7\" (UID: \"9b137590-67f2-410c-a41d-ef930a9357e1\") " pod="kuadrant-system/kuadrant-operator-catalog-pv7v7" Apr 20 19:33:12.170400 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:12.170368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65mpd\" (UniqueName: \"kubernetes.io/projected/9b137590-67f2-410c-a41d-ef930a9357e1-kube-api-access-65mpd\") pod \"kuadrant-operator-catalog-pv7v7\" (UID: \"9b137590-67f2-410c-a41d-ef930a9357e1\") " pod="kuadrant-system/kuadrant-operator-catalog-pv7v7" Apr 20 19:33:12.178942 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:12.178911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65mpd\" (UniqueName: \"kubernetes.io/projected/9b137590-67f2-410c-a41d-ef930a9357e1-kube-api-access-65mpd\") pod \"kuadrant-operator-catalog-pv7v7\" (UID: \"9b137590-67f2-410c-a41d-ef930a9357e1\") " pod="kuadrant-system/kuadrant-operator-catalog-pv7v7" Apr 20 19:33:12.211709 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:12.211670 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" event={"ID":"63be7e17-6758-431f-8c27-821172dbbff1","Type":"ContainerStarted","Data":"f38460b314c80fde6dfbeda1892939a90bba57198466e25d74486cc21e4fa02d"} Apr 20 19:33:12.285158 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:12.285076 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-pv7v7" Apr 20 19:33:12.417555 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:12.417533 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pv7v7"] Apr 20 19:33:12.460366 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:33:12.460330 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b137590_67f2_410c_a41d_ef930a9357e1.slice/crio-e5c348f0e560bcdb745b246d4880a128807b327e2fcb88ac9ad69a03b5c85450 WatchSource:0}: Error finding container e5c348f0e560bcdb745b246d4880a128807b327e2fcb88ac9ad69a03b5c85450: Status 404 returned error can't find the container with id e5c348f0e560bcdb745b246d4880a128807b327e2fcb88ac9ad69a03b5c85450 Apr 20 19:33:13.219086 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:13.219046 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-pv7v7" event={"ID":"9b137590-67f2-410c-a41d-ef930a9357e1","Type":"ContainerStarted","Data":"e5c348f0e560bcdb745b246d4880a128807b327e2fcb88ac9ad69a03b5c85450"} Apr 20 19:33:14.223968 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:14.223926 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" event={"ID":"63be7e17-6758-431f-8c27-821172dbbff1","Type":"ContainerStarted","Data":"a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864"} Apr 20 19:33:14.224485 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:14.223979 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" podUID="63be7e17-6758-431f-8c27-821172dbbff1" containerName="registry-server" containerID="cri-o://a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864" gracePeriod=2 Apr 20 19:33:14.225296 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:14.225267 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-pv7v7" event={"ID":"9b137590-67f2-410c-a41d-ef930a9357e1","Type":"ContainerStarted","Data":"7f7d5ad30bffde9534a5f64b9184b7e6db52b84eb71ebaca98446fbdf0cd8347"} Apr 20 19:33:14.241860 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:14.241812 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" podStartSLOduration=1.278628453 podStartE2EDuration="3.241797746s" podCreationTimestamp="2026-04-20 19:33:11 +0000 UTC" firstStartedPulling="2026-04-20 19:33:11.831071495 +0000 UTC m=+465.821559022" lastFinishedPulling="2026-04-20 19:33:13.794240798 +0000 UTC m=+467.784728315" observedRunningTime="2026-04-20 19:33:14.239542694 +0000 UTC m=+468.230030230" watchObservedRunningTime="2026-04-20 19:33:14.241797746 +0000 UTC m=+468.232285281" Apr 20 19:33:14.255887 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:14.255829 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-pv7v7" podStartSLOduration=1.922438332 podStartE2EDuration="3.255814171s" podCreationTimestamp="2026-04-20 19:33:11 +0000 UTC" firstStartedPulling="2026-04-20 19:33:12.461622763 +0000 UTC m=+466.452110276" lastFinishedPulling="2026-04-20 19:33:13.794998598 +0000 UTC m=+467.785486115" observedRunningTime="2026-04-20 19:33:14.255103829 +0000 UTC m=+468.245591365" watchObservedRunningTime="2026-04-20 19:33:14.255814171 +0000 UTC m=+468.246301708" Apr 20 
19:33:14.467162 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:14.467136 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" Apr 20 19:33:14.596013 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:14.595981 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcg7c\" (UniqueName: \"kubernetes.io/projected/63be7e17-6758-431f-8c27-821172dbbff1-kube-api-access-mcg7c\") pod \"63be7e17-6758-431f-8c27-821172dbbff1\" (UID: \"63be7e17-6758-431f-8c27-821172dbbff1\") " Apr 20 19:33:14.598130 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:14.598106 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63be7e17-6758-431f-8c27-821172dbbff1-kube-api-access-mcg7c" (OuterVolumeSpecName: "kube-api-access-mcg7c") pod "63be7e17-6758-431f-8c27-821172dbbff1" (UID: "63be7e17-6758-431f-8c27-821172dbbff1"). InnerVolumeSpecName "kube-api-access-mcg7c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:33:14.696847 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:14.696798 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mcg7c\" (UniqueName: \"kubernetes.io/projected/63be7e17-6758-431f-8c27-821172dbbff1-kube-api-access-mcg7c\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:33:15.230474 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:15.230426 2572 generic.go:358] "Generic (PLEG): container finished" podID="63be7e17-6758-431f-8c27-821172dbbff1" containerID="a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864" exitCode=0 Apr 20 19:33:15.230919 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:15.230515 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" Apr 20 19:33:15.230919 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:15.230514 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" event={"ID":"63be7e17-6758-431f-8c27-821172dbbff1","Type":"ContainerDied","Data":"a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864"} Apr 20 19:33:15.230919 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:15.230548 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kwrs9" event={"ID":"63be7e17-6758-431f-8c27-821172dbbff1","Type":"ContainerDied","Data":"f38460b314c80fde6dfbeda1892939a90bba57198466e25d74486cc21e4fa02d"} Apr 20 19:33:15.230919 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:15.230565 2572 scope.go:117] "RemoveContainer" containerID="a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864" Apr 20 19:33:15.239546 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:15.239528 2572 scope.go:117] "RemoveContainer" containerID="a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864" Apr 20 19:33:15.239855 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:33:15.239826 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864\": container with ID starting with a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864 not found: ID does not exist" containerID="a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864" Apr 20 19:33:15.239961 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:15.239868 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864"} err="failed to get container status \"a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864\": rpc error: code = NotFound desc = could not find container \"a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864\": container with ID starting with a244785b9b133c8354b4c4c81bccc4cc06e652512adef94e5d8d2017cd3a7864 not found: ID does not exist" Apr 20 19:33:15.247647 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:15.247622 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kwrs9"] Apr 20 19:33:15.251918 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:15.251896 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kwrs9"] Apr 20 19:33:16.602712 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:16.602678 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63be7e17-6758-431f-8c27-821172dbbff1" path="/var/lib/kubelet/pods/63be7e17-6758-431f-8c27-821172dbbff1/volumes" Apr 20 19:33:22.286041 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:22.285995 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-pv7v7" Apr 20 19:33:22.286041 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:22.286046 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-pv7v7" Apr 20 19:33:22.307501 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:22.307474 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-pv7v7" Apr 20 
19:33:23.281737 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:23.281709 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-pv7v7" Apr 20 19:33:42.571534 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:42.571503 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k"] Apr 20 19:33:42.572002 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:42.571867 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63be7e17-6758-431f-8c27-821172dbbff1" containerName="registry-server" Apr 20 19:33:42.572002 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:42.571878 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="63be7e17-6758-431f-8c27-821172dbbff1" containerName="registry-server" Apr 20 19:33:42.572002 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:42.571946 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="63be7e17-6758-431f-8c27-821172dbbff1" containerName="registry-server" Apr 20 19:33:42.577377 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:42.577355 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k" Apr 20 19:33:42.580264 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:42.580242 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 19:33:42.580561 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:42.580544 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-jwlg4\"" Apr 20 19:33:42.590330 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:42.590307 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k"] Apr 20 19:33:42.640039 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:42.640003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7s4w\" (UniqueName: \"kubernetes.io/projected/5913892d-89ac-4fc0-8208-805a302a71f1-kube-api-access-k7s4w\") pod \"dns-operator-controller-manager-648d5c98bc-wbk2k\" (UID: \"5913892d-89ac-4fc0-8208-805a302a71f1\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k" Apr 20 19:33:42.741332 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:42.741292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7s4w\" (UniqueName: \"kubernetes.io/projected/5913892d-89ac-4fc0-8208-805a302a71f1-kube-api-access-k7s4w\") pod \"dns-operator-controller-manager-648d5c98bc-wbk2k\" (UID: \"5913892d-89ac-4fc0-8208-805a302a71f1\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k" Apr 20 19:33:42.749646 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:42.749619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7s4w\" (UniqueName: \"kubernetes.io/projected/5913892d-89ac-4fc0-8208-805a302a71f1-kube-api-access-k7s4w\") pod \"dns-operator-controller-manager-648d5c98bc-wbk2k\" (UID: \"5913892d-89ac-4fc0-8208-805a302a71f1\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k" Apr 20 19:33:42.887798 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:42.887700 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k" Apr 20 19:33:43.012938 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:43.012908 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k"] Apr 20 19:33:43.015629 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:33:43.015599 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5913892d_89ac_4fc0_8208_805a302a71f1.slice/crio-d64386a9df74da196480a87a57b8ebb1a7f299af3ac8c98455da43b611c06646 WatchSource:0}: Error finding container d64386a9df74da196480a87a57b8ebb1a7f299af3ac8c98455da43b611c06646: Status 404 returned error can't find the container with id d64386a9df74da196480a87a57b8ebb1a7f299af3ac8c98455da43b611c06646 Apr 20 19:33:43.329353 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:43.329315 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k" event={"ID":"5913892d-89ac-4fc0-8208-805a302a71f1","Type":"ContainerStarted","Data":"d64386a9df74da196480a87a57b8ebb1a7f299af3ac8c98455da43b611c06646"} Apr 20 19:33:45.216479 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.216424 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65df76fddb-5pzvr"] Apr 20 19:33:45.229664 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.229626 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.230924 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.230874 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65df76fddb-5pzvr"] Apr 20 19:33:45.233066 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.233035 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 19:33:45.233228 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.233035 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 19:33:45.233498 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.233340 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wvcd6\"" Apr 20 19:33:45.233816 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.233624 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 19:33:45.234284 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.234109 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 19:33:45.234407 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.234378 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 19:33:45.238496 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.238474 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 19:33:45.265343 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.265305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1762578a-ac16-475b-8c21-b6d0085b8549-service-ca\") pod 
\"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.265534 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.265350 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1762578a-ac16-475b-8c21-b6d0085b8549-console-oauth-config\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.265534 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.265381 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1762578a-ac16-475b-8c21-b6d0085b8549-console-serving-cert\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.265534 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.265423 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1762578a-ac16-475b-8c21-b6d0085b8549-trusted-ca-bundle\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.265534 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.265461 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1762578a-ac16-475b-8c21-b6d0085b8549-oauth-serving-cert\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.265714 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.265563 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2f67\" (UniqueName: \"kubernetes.io/projected/1762578a-ac16-475b-8c21-b6d0085b8549-kube-api-access-p2f67\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.265714 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.265625 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1762578a-ac16-475b-8c21-b6d0085b8549-console-config\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.366151 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.366116 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1762578a-ac16-475b-8c21-b6d0085b8549-trusted-ca-bundle\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.366151 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.366158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1762578a-ac16-475b-8c21-b6d0085b8549-oauth-serving-cert\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " 
pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.366401 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.366219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2f67\" (UniqueName: \"kubernetes.io/projected/1762578a-ac16-475b-8c21-b6d0085b8549-kube-api-access-p2f67\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.366401 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.366266 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1762578a-ac16-475b-8c21-b6d0085b8549-console-config\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.366401 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.366302 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1762578a-ac16-475b-8c21-b6d0085b8549-service-ca\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.366401 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.366333 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1762578a-ac16-475b-8c21-b6d0085b8549-console-oauth-config\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.366401 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.366362 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1762578a-ac16-475b-8c21-b6d0085b8549-console-serving-cert\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.367070 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.367043 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1762578a-ac16-475b-8c21-b6d0085b8549-oauth-serving-cert\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.367188 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.367133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1762578a-ac16-475b-8c21-b6d0085b8549-console-config\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.367188 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.367130 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1762578a-ac16-475b-8c21-b6d0085b8549-trusted-ca-bundle\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.367188 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.367175 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1762578a-ac16-475b-8c21-b6d0085b8549-service-ca\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.369342 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.369318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1762578a-ac16-475b-8c21-b6d0085b8549-console-oauth-config\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.369491 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.369473 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1762578a-ac16-475b-8c21-b6d0085b8549-console-serving-cert\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.383044 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.383018 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2f67\" (UniqueName: \"kubernetes.io/projected/1762578a-ac16-475b-8c21-b6d0085b8549-kube-api-access-p2f67\") pod \"console-65df76fddb-5pzvr\" (UID: \"1762578a-ac16-475b-8c21-b6d0085b8549\") " pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.543408 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.543377 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:45.667949 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:45.667923 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65df76fddb-5pzvr"] Apr 20 19:33:45.670396 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:33:45.670365 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1762578a_ac16_475b_8c21_b6d0085b8549.slice/crio-cb5a201e97b21cb6f84b19ebeb411b5defd61056048ed7cc5319c61465171b4e WatchSource:0}: Error finding container cb5a201e97b21cb6f84b19ebeb411b5defd61056048ed7cc5319c61465171b4e: Status 404 returned error can't find the container with id cb5a201e97b21cb6f84b19ebeb411b5defd61056048ed7cc5319c61465171b4e Apr 20 19:33:46.342377 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.342341 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k" event={"ID":"5913892d-89ac-4fc0-8208-805a302a71f1","Type":"ContainerStarted","Data":"c5390f2f7ef19e6b91d613738f2c61daf30903329d00d941e2d56892de38be4c"} Apr 20 19:33:46.342850 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.342408 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k" Apr 20 19:33:46.343746 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.343725 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65df76fddb-5pzvr" event={"ID":"1762578a-ac16-475b-8c21-b6d0085b8549","Type":"ContainerStarted","Data":"1ae69c80d8da19f268e10095e1694be8dba4068f4300e01d1cb71e437585e882"} Apr 20 19:33:46.343838 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.343751 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65df76fddb-5pzvr" 
event={"ID":"1762578a-ac16-475b-8c21-b6d0085b8549","Type":"ContainerStarted","Data":"cb5a201e97b21cb6f84b19ebeb411b5defd61056048ed7cc5319c61465171b4e"} Apr 20 19:33:46.362528 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.362475 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k" podStartSLOduration=1.8736841659999999 podStartE2EDuration="4.362461869s" podCreationTimestamp="2026-04-20 19:33:42 +0000 UTC" firstStartedPulling="2026-04-20 19:33:43.017526714 +0000 UTC m=+497.008014227" lastFinishedPulling="2026-04-20 19:33:45.506304417 +0000 UTC m=+499.496791930" observedRunningTime="2026-04-20 19:33:46.359706055 +0000 UTC m=+500.350193612" watchObservedRunningTime="2026-04-20 19:33:46.362461869 +0000 UTC m=+500.352949398" Apr 20 19:33:46.842718 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.842669 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65df76fddb-5pzvr" podStartSLOduration=1.842650946 podStartE2EDuration="1.842650946s" podCreationTimestamp="2026-04-20 19:33:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:33:46.377759561 +0000 UTC m=+500.368247121" watchObservedRunningTime="2026-04-20 19:33:46.842650946 +0000 UTC m=+500.833138481" Apr 20 19:33:46.844665 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.844637 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-sfmzv"] Apr 20 19:33:46.847241 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.847218 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-sfmzv" Apr 20 19:33:46.850038 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.850016 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-vh6mz\"" Apr 20 19:33:46.865405 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.865381 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-sfmzv"] Apr 20 19:33:46.880007 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.879979 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gngm4\" (UniqueName: \"kubernetes.io/projected/bc91b37b-b181-4658-9961-bca7209a70b9-kube-api-access-gngm4\") pod \"authorino-operator-657f44b778-sfmzv\" (UID: \"bc91b37b-b181-4658-9961-bca7209a70b9\") " pod="kuadrant-system/authorino-operator-657f44b778-sfmzv" Apr 20 19:33:46.981602 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.981561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gngm4\" (UniqueName: \"kubernetes.io/projected/bc91b37b-b181-4658-9961-bca7209a70b9-kube-api-access-gngm4\") pod \"authorino-operator-657f44b778-sfmzv\" (UID: \"bc91b37b-b181-4658-9961-bca7209a70b9\") " pod="kuadrant-system/authorino-operator-657f44b778-sfmzv" Apr 20 19:33:46.998077 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:46.998045 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gngm4\" (UniqueName: \"kubernetes.io/projected/bc91b37b-b181-4658-9961-bca7209a70b9-kube-api-access-gngm4\") pod \"authorino-operator-657f44b778-sfmzv\" (UID: \"bc91b37b-b181-4658-9961-bca7209a70b9\") " 
pod="kuadrant-system/authorino-operator-657f44b778-sfmzv" Apr 20 19:33:47.158317 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:47.158207 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-sfmzv" Apr 20 19:33:47.339099 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:47.339063 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-sfmzv"] Apr 20 19:33:47.342668 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:33:47.342614 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc91b37b_b181_4658_9961_bca7209a70b9.slice/crio-b7e80bba53edd9eed0d521ecd65a565114a5b01f780b228984abda6dab376004 WatchSource:0}: Error finding container b7e80bba53edd9eed0d521ecd65a565114a5b01f780b228984abda6dab376004: Status 404 returned error can't find the container with id b7e80bba53edd9eed0d521ecd65a565114a5b01f780b228984abda6dab376004 Apr 20 19:33:47.348993 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:47.348945 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-sfmzv" event={"ID":"bc91b37b-b181-4658-9961-bca7209a70b9","Type":"ContainerStarted","Data":"b7e80bba53edd9eed0d521ecd65a565114a5b01f780b228984abda6dab376004"} Apr 20 19:33:49.953881 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:49.953845 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw"] Apr 20 19:33:49.956517 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:49.956496 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" Apr 20 19:33:49.959381 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:49.959358 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-2sbfm\"" Apr 20 19:33:49.968123 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:49.968103 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw"] Apr 20 19:33:50.010718 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:50.010688 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhrpq\" (UniqueName: \"kubernetes.io/projected/bbf89f1a-4664-4665-a223-f3df84cc2fb4-kube-api-access-rhrpq\") pod \"limitador-operator-controller-manager-85c4996f8c-7q7sw\" (UID: \"bbf89f1a-4664-4665-a223-f3df84cc2fb4\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" Apr 20 19:33:50.111774 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:50.111733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhrpq\" (UniqueName: \"kubernetes.io/projected/bbf89f1a-4664-4665-a223-f3df84cc2fb4-kube-api-access-rhrpq\") pod \"limitador-operator-controller-manager-85c4996f8c-7q7sw\" (UID: \"bbf89f1a-4664-4665-a223-f3df84cc2fb4\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" Apr 20 19:33:50.127153 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:50.127121 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhrpq\" (UniqueName: \"kubernetes.io/projected/bbf89f1a-4664-4665-a223-f3df84cc2fb4-kube-api-access-rhrpq\") pod 
\"limitador-operator-controller-manager-85c4996f8c-7q7sw\" (UID: \"bbf89f1a-4664-4665-a223-f3df84cc2fb4\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" Apr 20 19:33:50.267640 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:50.267547 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" Apr 20 19:33:50.362452 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:50.362411 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-sfmzv" event={"ID":"bc91b37b-b181-4658-9961-bca7209a70b9","Type":"ContainerStarted","Data":"bee6fb36fb5b48bc0e70a1be3db6626d03f9c1395aebb52388ac932f88fe6e55"} Apr 20 19:33:50.362624 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:50.362489 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-sfmzv" Apr 20 19:33:50.392977 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:50.391535 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-sfmzv" podStartSLOduration=2.259210161 podStartE2EDuration="4.391512297s" podCreationTimestamp="2026-04-20 19:33:46 +0000 UTC" firstStartedPulling="2026-04-20 19:33:47.344881814 +0000 UTC m=+501.335369329" lastFinishedPulling="2026-04-20 19:33:49.477183952 +0000 UTC m=+503.467671465" observedRunningTime="2026-04-20 19:33:50.389411685 +0000 UTC m=+504.379899221" watchObservedRunningTime="2026-04-20 19:33:50.391512297 +0000 UTC m=+504.381999845" Apr 20 19:33:50.428247 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:50.428216 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw"] Apr 20 19:33:50.434162 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:33:50.434119 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbf89f1a_4664_4665_a223_f3df84cc2fb4.slice/crio-da94029cefcac33cf5d70dc4c0eb7f91e5142676a01c021ee8e532e2c163e04f WatchSource:0}: Error finding container da94029cefcac33cf5d70dc4c0eb7f91e5142676a01c021ee8e532e2c163e04f: Status 404 returned error can't find the container with id da94029cefcac33cf5d70dc4c0eb7f91e5142676a01c021ee8e532e2c163e04f Apr 20 19:33:51.368368 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:51.368326 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" event={"ID":"bbf89f1a-4664-4665-a223-f3df84cc2fb4","Type":"ContainerStarted","Data":"da94029cefcac33cf5d70dc4c0eb7f91e5142676a01c021ee8e532e2c163e04f"} Apr 20 19:33:52.373144 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:52.373109 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" event={"ID":"bbf89f1a-4664-4665-a223-f3df84cc2fb4","Type":"ContainerStarted","Data":"d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9"} Apr 20 19:33:52.373610 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:52.373212 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" Apr 20 19:33:52.398104 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:52.398048 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" podStartSLOduration=2.038398285 podStartE2EDuration="3.398033826s" podCreationTimestamp="2026-04-20 19:33:49 +0000 UTC" firstStartedPulling="2026-04-20 19:33:50.436288025 +0000 UTC m=+504.426775540" lastFinishedPulling="2026-04-20 19:33:51.795923564 +0000 UTC m=+505.786411081" observedRunningTime="2026-04-20 19:33:52.395816359 +0000 UTC m=+506.386303894" watchObservedRunningTime="2026-04-20 19:33:52.398033826 +0000 UTC m=+506.388521361" Apr 20 19:33:55.543728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:55.543689 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:55.543728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:55.543733 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:55.548608 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:55.548585 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:56.398506 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:56.398476 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65df76fddb-5pzvr" Apr 20 19:33:57.351237 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:57.351197 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k" Apr 20 19:34:01.370852 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:01.370819 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-sfmzv" Apr 20 19:34:03.386998 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:03.386971 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" Apr 20 19:34:13.173099 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.173060 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw"] Apr 20 19:34:13.173863 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.173432 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" containerName="manager" containerID="cri-o://d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9" gracePeriod=2 Apr 20 19:34:13.189832 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.189800 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw"] Apr 20 19:34:13.201415 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.201378 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42"] Apr 20 19:34:13.201998 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.201979 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" containerName="manager" Apr 20 19:34:13.202073 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.202000 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" containerName="manager" Apr 20 19:34:13.202125 ip-10-0-131-162 
Apr 20 19:33:55.543728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:55.543689 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65df76fddb-5pzvr"
Apr 20 19:33:55.543728 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:55.543733 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65df76fddb-5pzvr"
Apr 20 19:33:55.548608 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:55.548585 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65df76fddb-5pzvr"
Apr 20 19:33:56.398506 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:56.398476 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65df76fddb-5pzvr"
Apr 20 19:33:57.351237 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:33:57.351197 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-wbk2k"
Apr 20 19:34:01.370852 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:01.370819 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-sfmzv"
Apr 20 19:34:03.386998 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:03.386971 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw"
Apr 20 19:34:13.173099 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.173060 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw"]
Apr 20 19:34:13.173863 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.173432 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" containerName="manager" containerID="cri-o://d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9" gracePeriod=2
Apr 20 19:34:13.189832 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.189800 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw"]
Apr 20 19:34:13.201415 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.201378 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42"]
Apr 20 19:34:13.201998 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.201979 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" containerName="manager"
Apr 20 19:34:13.202073 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.202000 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" containerName="manager"
Apr 20 19:34:13.202125 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.202114 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" containerName="manager"
Apr 20 19:34:13.207195 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.207175 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42"
Apr 20 19:34:13.211471 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.211245 2572 status_manager.go:895] "Failed to get status for pod" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" err="pods \"limitador-operator-controller-manager-85c4996f8c-7q7sw\" is forbidden: User \"system:node:ip-10-0-131-162.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-162.ec2.internal' and this object"
Apr 20 19:34:13.214103 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.214075 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42"]
Apr 20 19:34:13.330966 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.330932 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p4bf\" (UniqueName: \"kubernetes.io/projected/af242632-188d-41a5-b48b-fb8d9ea4acc1-kube-api-access-9p4bf\") pod \"limitador-operator-controller-manager-85c4996f8c-fmx42\" (UID: \"af242632-188d-41a5-b48b-fb8d9ea4acc1\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42"
Apr 20 19:34:13.408856 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.408834 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw"
Apr 20 19:34:13.411213 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.411162 2572 status_manager.go:895] "Failed to get status for pod" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" err="pods \"limitador-operator-controller-manager-85c4996f8c-7q7sw\" is forbidden: User \"system:node:ip-10-0-131-162.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-162.ec2.internal' and this object"
Apr 20 19:34:13.431537 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.431463 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhrpq\" (UniqueName: \"kubernetes.io/projected/bbf89f1a-4664-4665-a223-f3df84cc2fb4-kube-api-access-rhrpq\") pod \"bbf89f1a-4664-4665-a223-f3df84cc2fb4\" (UID: \"bbf89f1a-4664-4665-a223-f3df84cc2fb4\") "
Apr 20 19:34:13.431679 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.431664 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9p4bf\" (UniqueName: \"kubernetes.io/projected/af242632-188d-41a5-b48b-fb8d9ea4acc1-kube-api-access-9p4bf\") pod \"limitador-operator-controller-manager-85c4996f8c-fmx42\" (UID: \"af242632-188d-41a5-b48b-fb8d9ea4acc1\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42"
Apr 20 19:34:13.433481 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.433452 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf89f1a-4664-4665-a223-f3df84cc2fb4-kube-api-access-rhrpq" (OuterVolumeSpecName: "kube-api-access-rhrpq") pod "bbf89f1a-4664-4665-a223-f3df84cc2fb4" (UID: "bbf89f1a-4664-4665-a223-f3df84cc2fb4"). InnerVolumeSpecName "kube-api-access-rhrpq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:34:13.442687 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.442664 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p4bf\" (UniqueName: \"kubernetes.io/projected/af242632-188d-41a5-b48b-fb8d9ea4acc1-kube-api-access-9p4bf\") pod \"limitador-operator-controller-manager-85c4996f8c-fmx42\" (UID: \"af242632-188d-41a5-b48b-fb8d9ea4acc1\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42"
Apr 20 19:34:13.461186 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.461155 2572 generic.go:358] "Generic (PLEG): container finished" podID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" containerID="d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9" exitCode=0
Apr 20 19:34:13.461316 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.461202 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw"
Apr 20 19:34:13.461316 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.461216 2572 scope.go:117] "RemoveContainer" containerID="d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9"
Apr 20 19:34:13.463555 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.463532 2572 status_manager.go:895] "Failed to get status for pod" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" err="pods \"limitador-operator-controller-manager-85c4996f8c-7q7sw\" is forbidden: User \"system:node:ip-10-0-131-162.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-162.ec2.internal' and this object"
Apr 20 19:34:13.469370 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.469353 2572 scope.go:117] "RemoveContainer" containerID="d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9"
Apr 20 19:34:13.469635 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:34:13.469612 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9\": container with ID starting with d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9 not found: ID does not exist" containerID="d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9"
Apr 20 19:34:13.469725 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.469640 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9"} err="failed to get container status \"d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9\": rpc error: code = NotFound desc = could not find container \"d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9\": container with ID starting with d885576edf71425d68e2ffb9c54a1ce8ee9199b5630ee6e57ac3561884edb9c9 not found: ID does not exist"
Apr 20 19:34:13.471464 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.471424 2572 status_manager.go:895] "Failed to get status for pod" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" err="pods \"limitador-operator-controller-manager-85c4996f8c-7q7sw\" is forbidden: User \"system:node:ip-10-0-131-162.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-162.ec2.internal' and this object"
Apr 20 19:34:13.532928 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.532894 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rhrpq\" (UniqueName: \"kubernetes.io/projected/bbf89f1a-4664-4665-a223-f3df84cc2fb4-kube-api-access-rhrpq\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\""
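Two recurring error shapes in this teardown are expected noise rather than faults. The "forbidden ... no relationship found between node ... and this object" denials come from the node authorizer: the kubelet (user system:node:...) may only read pods bound to its node, and once the replaced pod is deleted server-side that binding no longer exists, so the status manager's final GET is rejected. The "ContainerStatus from runtime service failed ... NotFound" pair is a race where the container was already removed from CRI-O by the time the kubelet retried RemoveContainer. A hypothetical triage helper for filtering both patterns when scanning such logs (illustrative only, names are made up):

    # Hypothetical helper: classify the two benign teardown races seen above.
    import re

    BENIGN_PATTERNS = [
        # (a) node-authorizer denial after the pod object is gone
        re.compile(r"no relationship found between node .* and this object"),
        # (b) runtime NotFound for a container that was already removed
        re.compile(r"code = NotFound .* container with ID starting with"),
    ]

    def is_benign_teardown(line: str) -> bool:
        return any(p.search(line) for p in BENIGN_PATTERNS)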
Apr 20 19:34:13.551804 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.551771 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42"
Apr 20 19:34:13.683304 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:13.683278 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42"]
Apr 20 19:34:13.685919 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:34:13.685890 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf242632_188d_41a5_b48b_fb8d9ea4acc1.slice/crio-87e9afba193ea4af817429617a60e8a97942d3f8f71199c13269a8276650838a WatchSource:0}: Error finding container 87e9afba193ea4af817429617a60e8a97942d3f8f71199c13269a8276650838a: Status 404 returned error can't find the container with id 87e9afba193ea4af817429617a60e8a97942d3f8f71199c13269a8276650838a
Apr 20 19:34:14.386464 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:14.386378 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" containerName="manager" probeResult="failure" output="Get \"http://10.133.0.33:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Apr 20 19:34:14.467290 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:14.467255 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42" event={"ID":"af242632-188d-41a5-b48b-fb8d9ea4acc1","Type":"ContainerStarted","Data":"46ee0cfca02e060a6b9dddb80bd672f892322630c4ea97b79bc22416319dd34f"}
Apr 20 19:34:14.467290 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:14.467290 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42" event={"ID":"af242632-188d-41a5-b48b-fb8d9ea4acc1","Type":"ContainerStarted","Data":"87e9afba193ea4af817429617a60e8a97942d3f8f71199c13269a8276650838a"}
Apr 20 19:34:14.467533 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:14.467395 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42"
Apr 20 19:34:14.469750 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:14.469729 2572 status_manager.go:895] "Failed to get status for pod" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7q7sw" err="pods \"limitador-operator-controller-manager-85c4996f8c-7q7sw\" is forbidden: User \"system:node:ip-10-0-131-162.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-162.ec2.internal' and this object"
Apr 20 19:34:14.495117 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:14.495060 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42" podStartSLOduration=1.495042993 podStartE2EDuration="1.495042993s" podCreationTimestamp="2026-04-20 19:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:34:14.494562136 +0000 UTC m=+528.485049670" watchObservedRunningTime="2026-04-20 19:34:14.495042993 +0000 UTC m=+528.485530528"
Apr 20 19:34:14.603134 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:14.603101 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf89f1a-4664-4665-a223-f3df84cc2fb4" path="/var/lib/kubelet/pods/bbf89f1a-4664-4665-a223-f3df84cc2fb4/volumes"
Apr 20 19:34:25.473846 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:25.473772 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fmx42"
Apr 20 19:34:41.885235 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.885202 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"]
Apr 20 19:34:41.890580 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.890551 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:41.893139 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.893113 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-qbbbh\""
Apr 20 19:34:41.902460 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.902373 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"]
Apr 20 19:34:41.996545 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.996510 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5888b8ba-846a-49d6-a0b9-b409361b907e-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:41.996545 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.996547 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:41.996756 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.996574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:41.996756 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.996655 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5888b8ba-846a-49d6-a0b9-b409361b907e-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:41.996756 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.996691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vpmp\" (UniqueName: \"kubernetes.io/projected/5888b8ba-846a-49d6-a0b9-b409361b907e-kube-api-access-9vpmp\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:41.996852 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.996771 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:41.996852 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.996791 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:41.996852 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.996821 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5888b8ba-846a-49d6-a0b9-b409361b907e-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:41.996852 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:41.996843 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.098096 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5888b8ba-846a-49d6-a0b9-b409361b907e-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.098096 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098095 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vpmp\" (UniqueName: \"kubernetes.io/projected/5888b8ba-846a-49d6-a0b9-b409361b907e-kube-api-access-9vpmp\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.098341 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.098341 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.098341 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5888b8ba-846a-49d6-a0b9-b409361b907e-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.098532 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.098532 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098473 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5888b8ba-846a-49d6-a0b9-b409361b907e-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.098532 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.098734 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098537 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.098805 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098776 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.098866 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098837 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.098963 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098942 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.099015 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.098996 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.099175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.099156 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5888b8ba-846a-49d6-a0b9-b409361b907e-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.100613 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.100588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5888b8ba-846a-49d6-a0b9-b409361b907e-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.100781 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.100764 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5888b8ba-846a-49d6-a0b9-b409361b907e-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.114037 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.114008 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5888b8ba-846a-49d6-a0b9-b409361b907e-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.114149 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.114135 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vpmp\" (UniqueName: \"kubernetes.io/projected/5888b8ba-846a-49d6-a0b9-b409361b907e-kube-api-access-9vpmp\") pod \"maas-default-gateway-openshift-default-845c6b4b48-jkf4g\" (UID: \"5888b8ba-846a-49d6-a0b9-b409361b907e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.204719 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.204632 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:42.339618 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.339592 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"]
Apr 20 19:34:42.341912 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:34:42.341885 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5888b8ba_846a_49d6_a0b9_b409361b907e.slice/crio-fbc5aaf276d4f7f4611afdf2971ae627901a28028dbfa099405e33f8de2425df WatchSource:0}: Error finding container fbc5aaf276d4f7f4611afdf2971ae627901a28028dbfa099405e33f8de2425df: Status 404 returned error can't find the container with id fbc5aaf276d4f7f4611afdf2971ae627901a28028dbfa099405e33f8de2425df
Apr 20 19:34:42.343914 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.343880 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 19:34:42.344029 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.343965 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 19:34:42.344029 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.344007 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 19:34:42.568818 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.568787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g" event={"ID":"5888b8ba-846a-49d6-a0b9-b409361b907e","Type":"ContainerStarted","Data":"f024ef3784a7cfe292c41058e37ce37c6cd78b7195f016b60629c19ebd1c9657"}
Apr 20 19:34:42.568986 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.568826 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g" event={"ID":"5888b8ba-846a-49d6-a0b9-b409361b907e","Type":"ContainerStarted","Data":"fbc5aaf276d4f7f4611afdf2971ae627901a28028dbfa099405e33f8de2425df"}
Apr 20 19:34:42.592548 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:42.592499 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g" podStartSLOduration=1.5924847130000002 podStartE2EDuration="1.592484713s" podCreationTimestamp="2026-04-20 19:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:34:42.590812969 +0000 UTC m=+556.581300544" watchObservedRunningTime="2026-04-20 19:34:42.592484713 +0000 UTC m=+556.582972244"
Apr 20 19:34:43.205796 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:43.205750 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:43.210803 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:43.210777 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:43.572220 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:43.572188 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:43.573333 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:43.573310 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-jkf4g"
Apr 20 19:34:56.869565 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:56.869526 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-pd8t9"]
Apr 20 19:34:56.878344 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:56.877847 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-pd8t9"
Apr 20 19:34:56.880703 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:56.880674 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-j57d9\""
Apr 20 19:34:56.881483 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:56.881452 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-pd8t9"]
Apr 20 19:34:56.938833 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:56.938799 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dllx2\" (UniqueName: \"kubernetes.io/projected/a8ab0fb9-5b29-413c-b26c-858db722d4fe-kube-api-access-dllx2\") pod \"authorino-7498df8756-pd8t9\" (UID: \"a8ab0fb9-5b29-413c-b26c-858db722d4fe\") " pod="kuadrant-system/authorino-7498df8756-pd8t9"
Apr 20 19:34:57.039664 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:57.039628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dllx2\" (UniqueName: \"kubernetes.io/projected/a8ab0fb9-5b29-413c-b26c-858db722d4fe-kube-api-access-dllx2\") pod \"authorino-7498df8756-pd8t9\" (UID: \"a8ab0fb9-5b29-413c-b26c-858db722d4fe\") " pod="kuadrant-system/authorino-7498df8756-pd8t9"
Apr 20 19:34:57.047454 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:57.047419 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dllx2\" (UniqueName: \"kubernetes.io/projected/a8ab0fb9-5b29-413c-b26c-858db722d4fe-kube-api-access-dllx2\") pod \"authorino-7498df8756-pd8t9\" (UID: \"a8ab0fb9-5b29-413c-b26c-858db722d4fe\") " pod="kuadrant-system/authorino-7498df8756-pd8t9"
Apr 20 19:34:57.197491 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:57.197404 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-pd8t9"
Apr 20 19:34:57.322951 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:57.322747 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-pd8t9"]
Apr 20 19:34:57.326582 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:34:57.326553 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8ab0fb9_5b29_413c_b26c_858db722d4fe.slice/crio-5de88b8d5beee499ab367d5b53948396a5455c09f8f9a9f4d59d0513719acf38 WatchSource:0}: Error finding container 5de88b8d5beee499ab367d5b53948396a5455c09f8f9a9f4d59d0513719acf38: Status 404 returned error can't find the container with id 5de88b8d5beee499ab367d5b53948396a5455c09f8f9a9f4d59d0513719acf38
Apr 20 19:34:57.623834 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:34:57.623799 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-pd8t9" event={"ID":"a8ab0fb9-5b29-413c-b26c-858db722d4fe","Type":"ContainerStarted","Data":"5de88b8d5beee499ab367d5b53948396a5455c09f8f9a9f4d59d0513719acf38"}
Apr 20 19:35:00.635690 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:00.635652 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-pd8t9" event={"ID":"a8ab0fb9-5b29-413c-b26c-858db722d4fe","Type":"ContainerStarted","Data":"f06a79be2887081be57dd28f0edfa03b6e73a145652f50c451f26f7a0dc957fe"}
Apr 20 19:35:00.651855 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:00.651804 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-pd8t9" podStartSLOduration=1.73913578 podStartE2EDuration="4.651786308s" podCreationTimestamp="2026-04-20 19:34:56 +0000 UTC" firstStartedPulling="2026-04-20 19:34:57.327926691 +0000 UTC m=+571.318414204" lastFinishedPulling="2026-04-20 19:35:00.240577214 +0000 UTC m=+574.231064732" observedRunningTime="2026-04-20 19:35:00.649980791 +0000 UTC m=+574.640468325" watchObservedRunningTime="2026-04-20 19:35:00.651786308 +0000 UTC m=+574.642273844"
Apr 20 19:35:26.507742 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:26.507714 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/1.log"
Apr 20 19:35:26.510765 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:26.510741 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/1.log"
Apr 20 19:35:54.550269 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:54.550232 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 19:35:54.552960 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:54.552939 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 20 19:35:54.557002 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:54.556982 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\""
Apr 20 19:35:54.558164 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:54.558143 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-5pbf7\""
Apr 20 19:35:54.558269 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:54.558148 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 20 19:35:54.558269 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:54.558150 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 20 19:35:54.566614 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:54.566592 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 19:35:54.668471 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:54.668402 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dddd\" (UniqueName: \"kubernetes.io/projected/3f44d534-b149-449a-9860-4202dd9adca8-kube-api-access-6dddd\") pod \"maas-keycloak-0\" (UID: \"3f44d534-b149-449a-9860-4202dd9adca8\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 19:35:54.769175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:54.769136 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dddd\" (UniqueName: \"kubernetes.io/projected/3f44d534-b149-449a-9860-4202dd9adca8-kube-api-access-6dddd\") pod \"maas-keycloak-0\" (UID: \"3f44d534-b149-449a-9860-4202dd9adca8\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 19:35:54.778978 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:54.778952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dddd\" (UniqueName: \"kubernetes.io/projected/3f44d534-b149-449a-9860-4202dd9adca8-kube-api-access-6dddd\") pod \"maas-keycloak-0\" (UID: \"3f44d534-b149-449a-9860-4202dd9adca8\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 19:35:54.862798 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:54.862762 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 20 19:35:54.991423 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:54.991269 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 19:35:54.993799 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:35:54.993770 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f44d534_b149_449a_9860_4202dd9adca8.slice/crio-94d05f77cc7b3e6da79372d7b65a4ddd1eae41f25d9a93c3e76a03a161f2da24 WatchSource:0}: Error finding container 94d05f77cc7b3e6da79372d7b65a4ddd1eae41f25d9a93c3e76a03a161f2da24: Status 404 returned error can't find the container with id 94d05f77cc7b3e6da79372d7b65a4ddd1eae41f25d9a93c3e76a03a161f2da24
Apr 20 19:35:55.843632 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:55.843598 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"3f44d534-b149-449a-9860-4202dd9adca8","Type":"ContainerStarted","Data":"94d05f77cc7b3e6da79372d7b65a4ddd1eae41f25d9a93c3e76a03a161f2da24"}
Apr 20 19:35:59.863618 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:59.863570 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"3f44d534-b149-449a-9860-4202dd9adca8","Type":"ContainerStarted","Data":"0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5"}
Apr 20 19:35:59.884210 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:35:59.884152 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.220104971 podStartE2EDuration="5.884135537s" podCreationTimestamp="2026-04-20 19:35:54 +0000 UTC" firstStartedPulling="2026-04-20 19:35:54.995134755 +0000 UTC m=+628.985622272" lastFinishedPulling="2026-04-20 19:35:59.659165311 +0000 UTC m=+633.649652838" observedRunningTime="2026-04-20 19:35:59.881632628 +0000 UTC m=+633.872120164" watchObservedRunningTime="2026-04-20 19:35:59.884135537 +0000 UTC m=+633.874623104"
Apr 20 19:36:00.862990 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:00.862954 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:00.864856 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:00.864811 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:01.864111 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:01.864056 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:02.864261 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:02.864201 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:03.863382 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:03.863334 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:04.863578 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:04.863533 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:04.865242 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:04.864499 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:05.864372 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:05.864322 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:06.864035 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:06.863988 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:07.863459 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:07.863393 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:08.863353 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:08.863295 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:09.863451 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:09.863395 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:10.863612 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:10.863559 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:11.864122 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:11.864073 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:12.863928 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:12.863879 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 19:36:13.994723 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:13.994686 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:14.014241 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:14.014188 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 19:36:24.002030 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:24.001999 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0"
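The keycloak run above is a normal slow start, not a crash loop: the startup probe polls http://10.133.0.37:9000/health/started about once a second, gets "connection refused" from 19:36:00 until the listener comes up, passes at 19:36:13, and only then do readiness probes begin (first a 503, then ready at 19:36:24). A rough reproduction of that prober loop in Python (an approximation for illustration, not kubelet code; period and timeout values are assumptions):

    # Poll a startup endpoint until it answers 200, as the keycloak startup
    # probe does above. connection-refused and non-200 are both "failure".
    import time
    import urllib.error
    import urllib.request

    def probe(url: str, period: float = 1.0, timeout: float = 1.0) -> None:
        while True:
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    if resp.status == 200:
                        print("startup probe passed")
                        return
            except urllib.error.HTTPError as err:
                print(f"HTTP probe failed with statuscode: {err.code}")
            except (urllib.error.URLError, OSError) as err:
                print(f"probe failed: {err}")  # e.g. connect: connection refused
            time.sleep(period)

    # probe("http://10.133.0.37:9000/health/started")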
Apr 20 19:36:26.980516 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:26.980497 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-pd8t9" event={"ID":"a8ab0fb9-5b29-413c-b26c-858db722d4fe","Type":"ContainerDied","Data":"f06a79be2887081be57dd28f0edfa03b6e73a145652f50c451f26f7a0dc957fe"}
Apr 20 19:36:26.980775 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:26.980537 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-pd8t9" event={"ID":"a8ab0fb9-5b29-413c-b26c-858db722d4fe","Type":"ContainerDied","Data":"5de88b8d5beee499ab367d5b53948396a5455c09f8f9a9f4d59d0513719acf38"}
Apr 20 19:36:26.980775 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:26.980552 2572 scope.go:117] "RemoveContainer" containerID="f06a79be2887081be57dd28f0edfa03b6e73a145652f50c451f26f7a0dc957fe"
Apr 20 19:36:26.989796 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:26.989776 2572 scope.go:117] "RemoveContainer" containerID="f06a79be2887081be57dd28f0edfa03b6e73a145652f50c451f26f7a0dc957fe"
Apr 20 19:36:26.990091 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:36:26.990071 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f06a79be2887081be57dd28f0edfa03b6e73a145652f50c451f26f7a0dc957fe\": container with ID starting with f06a79be2887081be57dd28f0edfa03b6e73a145652f50c451f26f7a0dc957fe not found: ID does not exist" containerID="f06a79be2887081be57dd28f0edfa03b6e73a145652f50c451f26f7a0dc957fe"
Apr 20 19:36:26.990152 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:26.990107 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06a79be2887081be57dd28f0edfa03b6e73a145652f50c451f26f7a0dc957fe"} err="failed to get container status \"f06a79be2887081be57dd28f0edfa03b6e73a145652f50c451f26f7a0dc957fe\": rpc error: code = NotFound desc = could not find container \"f06a79be2887081be57dd28f0edfa03b6e73a145652f50c451f26f7a0dc957fe\": container with ID starting with f06a79be2887081be57dd28f0edfa03b6e73a145652f50c451f26f7a0dc957fe not found: ID does not exist"
Apr 20 19:36:27.007342 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.007308 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-pd8t9"]
Apr 20 19:36:27.015269 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.015237 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-pd8t9"]
Apr 20 19:36:27.128117 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.128084 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-cr7kf"]
Apr 20 19:36:27.128506 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.128491 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8ab0fb9-5b29-413c-b26c-858db722d4fe" containerName="authorino"
Apr 20 19:36:27.128506 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.128507 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ab0fb9-5b29-413c-b26c-858db722d4fe" containerName="authorino"
Apr 20 19:36:27.128640 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.128587 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8ab0fb9-5b29-413c-b26c-858db722d4fe" containerName="authorino"
Apr 20 19:36:27.132679 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.132656 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf"
Apr 20 19:36:27.138886 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.138860 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-h864r\""
Apr 20 19:36:27.142388 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.142359 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-cr7kf"]
Apr 20 19:36:27.206709 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.206679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5mrc\" (UniqueName: \"kubernetes.io/projected/f54b4d62-fe56-4373-851c-3102b0555745-kube-api-access-s5mrc\") pod \"maas-controller-6d4c8f55f9-cr7kf\" (UID: \"f54b4d62-fe56-4373-851c-3102b0555745\") " pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf"
Apr 20 19:36:27.275640 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.275555 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6687ccdc69-tn8ln"]
Apr 20 19:36:27.278818 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.278799 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6687ccdc69-tn8ln"
Apr 20 19:36:27.287166 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.287141 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6687ccdc69-tn8ln"]
Apr 20 19:36:27.308181 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.308148 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5mrc\" (UniqueName: \"kubernetes.io/projected/f54b4d62-fe56-4373-851c-3102b0555745-kube-api-access-s5mrc\") pod \"maas-controller-6d4c8f55f9-cr7kf\" (UID: \"f54b4d62-fe56-4373-851c-3102b0555745\") " pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf"
Apr 20 19:36:27.308321 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.308213 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwqjb\" (UniqueName: \"kubernetes.io/projected/0949d738-175c-43e7-b221-c7e68792b1a8-kube-api-access-wwqjb\") pod \"maas-controller-6687ccdc69-tn8ln\" (UID: \"0949d738-175c-43e7-b221-c7e68792b1a8\") " pod="opendatahub/maas-controller-6687ccdc69-tn8ln"
Apr 20 19:36:27.316166 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.316138 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5mrc\" (UniqueName: \"kubernetes.io/projected/f54b4d62-fe56-4373-851c-3102b0555745-kube-api-access-s5mrc\") pod \"maas-controller-6d4c8f55f9-cr7kf\" (UID: \"f54b4d62-fe56-4373-851c-3102b0555745\") " pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf"
Apr 20 19:36:27.382751 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.382715 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6687ccdc69-tn8ln"]
Apr 20 19:36:27.383120 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:36:27.383046 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-wwqjb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-6687ccdc69-tn8ln" podUID="0949d738-175c-43e7-b221-c7e68792b1a8"
Apr 20 19:36:27.407785 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.407740 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6cf548645d-smmfr"]
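The `RemoveContainer` followed by `ContainerStatus from runtime service failed ... NotFound` pair above is a benign race, not a fault: the kubelet removes the container, then a second cleanup path asks CRI-O for its status after it is already gone. CRI callers are expected to treat gRPC `NotFound` as "already removed"; a sketch of that check with the standard gRPC status package (`alreadyGone` is a hypothetical helper, `status`/`codes` are the real API):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// alreadyGone reports whether a CRI remove/status error just means the
// container no longer exists, which is what the kubelet hits above.
func alreadyGone(err error) bool {
	// status.Code returns codes.OK for nil errors, so this is safe
	// to call unconditionally on the returned error.
	return status.Code(err) == codes.NotFound
}

func main() {
	err := status.Error(codes.NotFound, "could not find container")
	fmt.Println(alreadyGone(err)) // true
}
```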
pods=["opendatahub/maas-controller-6cf548645d-smmfr"] Apr 20 19:36:27.408842 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.408816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwqjb\" (UniqueName: \"kubernetes.io/projected/0949d738-175c-43e7-b221-c7e68792b1a8-kube-api-access-wwqjb\") pod \"maas-controller-6687ccdc69-tn8ln\" (UID: \"0949d738-175c-43e7-b221-c7e68792b1a8\") " pod="opendatahub/maas-controller-6687ccdc69-tn8ln" Apr 20 19:36:27.411540 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.411524 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6cf548645d-smmfr" Apr 20 19:36:27.421459 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.421422 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6cf548645d-smmfr"] Apr 20 19:36:27.422499 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.422478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwqjb\" (UniqueName: \"kubernetes.io/projected/0949d738-175c-43e7-b221-c7e68792b1a8-kube-api-access-wwqjb\") pod \"maas-controller-6687ccdc69-tn8ln\" (UID: \"0949d738-175c-43e7-b221-c7e68792b1a8\") " pod="opendatahub/maas-controller-6687ccdc69-tn8ln" Apr 20 19:36:27.444537 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.444510 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf" Apr 20 19:36:27.509632 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.509596 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg272\" (UniqueName: \"kubernetes.io/projected/29925e69-43f1-4b40-9a0e-4b083f5c6887-kube-api-access-cg272\") pod \"maas-controller-6cf548645d-smmfr\" (UID: \"29925e69-43f1-4b40-9a0e-4b083f5c6887\") " pod="opendatahub/maas-controller-6cf548645d-smmfr" Apr 20 19:36:27.568788 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.568763 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-cr7kf"] Apr 20 19:36:27.570386 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:36:27.570357 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf54b4d62_fe56_4373_851c_3102b0555745.slice/crio-bd16dd7cb1b497ce04c77e9cffe8b85d28a350082c9853a81b82ca11ce4430ec WatchSource:0}: Error finding container bd16dd7cb1b497ce04c77e9cffe8b85d28a350082c9853a81b82ca11ce4430ec: Status 404 returned error can't find the container with id bd16dd7cb1b497ce04c77e9cffe8b85d28a350082c9853a81b82ca11ce4430ec Apr 20 19:36:27.610634 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.610596 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cg272\" (UniqueName: \"kubernetes.io/projected/29925e69-43f1-4b40-9a0e-4b083f5c6887-kube-api-access-cg272\") pod \"maas-controller-6cf548645d-smmfr\" (UID: \"29925e69-43f1-4b40-9a0e-4b083f5c6887\") " pod="opendatahub/maas-controller-6cf548645d-smmfr" Apr 20 19:36:27.619402 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.619368 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg272\" (UniqueName: \"kubernetes.io/projected/29925e69-43f1-4b40-9a0e-4b083f5c6887-kube-api-access-cg272\") pod \"maas-controller-6cf548645d-smmfr\" (UID: \"29925e69-43f1-4b40-9a0e-4b083f5c6887\") " 
pod="opendatahub/maas-controller-6cf548645d-smmfr" Apr 20 19:36:27.723360 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.723323 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6cf548645d-smmfr" Apr 20 19:36:27.845753 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.845725 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6cf548645d-smmfr"] Apr 20 19:36:27.848052 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:36:27.848023 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29925e69_43f1_4b40_9a0e_4b083f5c6887.slice/crio-37ae99329568319f666a4b040bb2615cd3c19f4bf861a4a63459b4f462a3d4b8 WatchSource:0}: Error finding container 37ae99329568319f666a4b040bb2615cd3c19f4bf861a4a63459b4f462a3d4b8: Status 404 returned error can't find the container with id 37ae99329568319f666a4b040bb2615cd3c19f4bf861a4a63459b4f462a3d4b8 Apr 20 19:36:27.987324 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.987273 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6cf548645d-smmfr" event={"ID":"29925e69-43f1-4b40-9a0e-4b083f5c6887","Type":"ContainerStarted","Data":"37ae99329568319f666a4b040bb2615cd3c19f4bf861a4a63459b4f462a3d4b8"} Apr 20 19:36:27.989773 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.989740 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf" event={"ID":"f54b4d62-fe56-4373-851c-3102b0555745","Type":"ContainerStarted","Data":"bd16dd7cb1b497ce04c77e9cffe8b85d28a350082c9853a81b82ca11ce4430ec"} Apr 20 19:36:27.989773 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.989755 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6687ccdc69-tn8ln" Apr 20 19:36:27.996915 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:27.996889 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6687ccdc69-tn8ln" Apr 20 19:36:28.013639 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:28.013613 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwqjb\" (UniqueName: \"kubernetes.io/projected/0949d738-175c-43e7-b221-c7e68792b1a8-kube-api-access-wwqjb\") pod \"0949d738-175c-43e7-b221-c7e68792b1a8\" (UID: \"0949d738-175c-43e7-b221-c7e68792b1a8\") " Apr 20 19:36:28.016018 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:28.015988 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0949d738-175c-43e7-b221-c7e68792b1a8-kube-api-access-wwqjb" (OuterVolumeSpecName: "kube-api-access-wwqjb") pod "0949d738-175c-43e7-b221-c7e68792b1a8" (UID: "0949d738-175c-43e7-b221-c7e68792b1a8"). InnerVolumeSpecName "kube-api-access-wwqjb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:36:28.115778 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:28.115691 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wwqjb\" (UniqueName: \"kubernetes.io/projected/0949d738-175c-43e7-b221-c7e68792b1a8-kube-api-access-wwqjb\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:36:28.607492 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:28.606281 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ab0fb9-5b29-413c-b26c-858db722d4fe" path="/var/lib/kubelet/pods/a8ab0fb9-5b29-413c-b26c-858db722d4fe/volumes" Apr 20 19:36:28.995805 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:28.995776 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6687ccdc69-tn8ln" Apr 20 19:36:29.024467 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:29.024075 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6687ccdc69-tn8ln"] Apr 20 19:36:29.028677 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:29.028619 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6687ccdc69-tn8ln"] Apr 20 19:36:30.604508 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:30.604476 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0949d738-175c-43e7-b221-c7e68792b1a8" path="/var/lib/kubelet/pods/0949d738-175c-43e7-b221-c7e68792b1a8/volumes" Apr 20 19:36:31.006669 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:31.006632 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6cf548645d-smmfr" event={"ID":"29925e69-43f1-4b40-9a0e-4b083f5c6887","Type":"ContainerStarted","Data":"8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696"} Apr 20 19:36:31.006872 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:31.006704 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6cf548645d-smmfr" Apr 20 19:36:31.008343 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:31.008313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf" event={"ID":"f54b4d62-fe56-4373-851c-3102b0555745","Type":"ContainerStarted","Data":"8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714"} Apr 20 19:36:31.008489 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:31.008450 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf" Apr 20 19:36:31.028253 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:31.028209 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6cf548645d-smmfr" podStartSLOduration=1.353798702 podStartE2EDuration="4.028195626s" podCreationTimestamp="2026-04-20 19:36:27 +0000 UTC" firstStartedPulling="2026-04-20 19:36:27.849434003 +0000 UTC m=+661.839921515" lastFinishedPulling="2026-04-20 19:36:30.523830926 +0000 UTC m=+664.514318439" observedRunningTime="2026-04-20 19:36:31.026187909 +0000 UTC m=+665.016675444" watchObservedRunningTime="2026-04-20 19:36:31.028195626 +0000 UTC m=+665.018683162" Apr 20 19:36:31.047436 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:31.047391 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf" podStartSLOduration=1.098526869 podStartE2EDuration="4.047377161s" 
podCreationTimestamp="2026-04-20 19:36:27 +0000 UTC" firstStartedPulling="2026-04-20 19:36:27.571821207 +0000 UTC m=+661.562308720" lastFinishedPulling="2026-04-20 19:36:30.520671498 +0000 UTC m=+664.511159012" observedRunningTime="2026-04-20 19:36:31.04632941 +0000 UTC m=+665.036816940" watchObservedRunningTime="2026-04-20 19:36:31.047377161 +0000 UTC m=+665.037864696" Apr 20 19:36:32.814091 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:32.814054 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-796c56555f-bbvsj"] Apr 20 19:36:32.890675 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:32.890639 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-796c56555f-bbvsj"] Apr 20 19:36:32.890865 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:32.890765 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-796c56555f-bbvsj" Apr 20 19:36:32.893374 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:32.893350 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 19:36:32.894303 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:32.894284 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 19:36:32.894415 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:32.894285 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-xf29q\"" Apr 20 19:36:32.964947 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:32.964909 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvxs4\" (UniqueName: \"kubernetes.io/projected/4c7766a2-46bc-4c85-bc78-67046b90b14e-kube-api-access-wvxs4\") pod \"maas-api-796c56555f-bbvsj\" (UID: \"4c7766a2-46bc-4c85-bc78-67046b90b14e\") " pod="opendatahub/maas-api-796c56555f-bbvsj" Apr 20 19:36:32.965112 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:32.964977 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/4c7766a2-46bc-4c85-bc78-67046b90b14e-maas-api-tls\") pod \"maas-api-796c56555f-bbvsj\" (UID: \"4c7766a2-46bc-4c85-bc78-67046b90b14e\") " pod="opendatahub/maas-api-796c56555f-bbvsj" Apr 20 19:36:33.065830 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:33.065738 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/4c7766a2-46bc-4c85-bc78-67046b90b14e-maas-api-tls\") pod \"maas-api-796c56555f-bbvsj\" (UID: \"4c7766a2-46bc-4c85-bc78-67046b90b14e\") " pod="opendatahub/maas-api-796c56555f-bbvsj" Apr 20 19:36:33.065979 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:33.065859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvxs4\" (UniqueName: \"kubernetes.io/projected/4c7766a2-46bc-4c85-bc78-67046b90b14e-kube-api-access-wvxs4\") pod \"maas-api-796c56555f-bbvsj\" (UID: \"4c7766a2-46bc-4c85-bc78-67046b90b14e\") " pod="opendatahub/maas-api-796c56555f-bbvsj" Apr 20 19:36:33.068304 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:33.068273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/4c7766a2-46bc-4c85-bc78-67046b90b14e-maas-api-tls\") pod \"maas-api-796c56555f-bbvsj\" (UID: 
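The `pod_startup_latency_tracker` entries are internally consistent: `podStartE2EDuration` equals `watchObservedRunningTime - podCreationTimestamp`, and `podStartSLOduration` is that figure minus the image-pull window (`lastFinishedPulling - firstStartedPulling`). For `maas-controller-6cf548645d-smmfr` above: 4.028195626s - 2.674396923s = 1.353798703s, matching the logged 1.353798702 up to rounding. A few lines of Go reproducing the arithmetic from the logged timestamps:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		// Layout matches the timestamps printed in the log.
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2026-04-20 19:36:27 +0000 UTC")
	pullStart := parse("2026-04-20 19:36:27.849434003 +0000 UTC")
	pullEnd := parse("2026-04-20 19:36:30.523830926 +0000 UTC")
	observed := parse("2026-04-20 19:36:31.028195626 +0000 UTC")

	e2e := observed.Sub(created)        // podStartE2EDuration
	slo := e2e - pullEnd.Sub(pullStart) // podStartSLOduration: pull time excluded
	fmt.Println(e2e, slo)               // 4.028195626s 1.353798703s
}
```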
\"4c7766a2-46bc-4c85-bc78-67046b90b14e\") " pod="opendatahub/maas-api-796c56555f-bbvsj" Apr 20 19:36:33.074567 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:33.074543 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvxs4\" (UniqueName: \"kubernetes.io/projected/4c7766a2-46bc-4c85-bc78-67046b90b14e-kube-api-access-wvxs4\") pod \"maas-api-796c56555f-bbvsj\" (UID: \"4c7766a2-46bc-4c85-bc78-67046b90b14e\") " pod="opendatahub/maas-api-796c56555f-bbvsj" Apr 20 19:36:33.202457 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:33.202410 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-796c56555f-bbvsj" Apr 20 19:36:33.542006 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:33.541978 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-796c56555f-bbvsj"] Apr 20 19:36:33.544224 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:36:33.544197 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c7766a2_46bc_4c85_bc78_67046b90b14e.slice/crio-3ca58666ca254114be270563cbf170bfcd542a0315cca13f85ff2ae8e064d61f WatchSource:0}: Error finding container 3ca58666ca254114be270563cbf170bfcd542a0315cca13f85ff2ae8e064d61f: Status 404 returned error can't find the container with id 3ca58666ca254114be270563cbf170bfcd542a0315cca13f85ff2ae8e064d61f Apr 20 19:36:34.023611 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:34.023574 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-796c56555f-bbvsj" event={"ID":"4c7766a2-46bc-4c85-bc78-67046b90b14e","Type":"ContainerStarted","Data":"3ca58666ca254114be270563cbf170bfcd542a0315cca13f85ff2ae8e064d61f"} Apr 20 19:36:35.028279 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:35.028244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-796c56555f-bbvsj" event={"ID":"4c7766a2-46bc-4c85-bc78-67046b90b14e","Type":"ContainerStarted","Data":"59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935"} Apr 20 19:36:35.028663 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:35.028361 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-796c56555f-bbvsj" Apr 20 19:36:35.047138 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:35.047087 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-796c56555f-bbvsj" podStartSLOduration=1.661190681 podStartE2EDuration="3.047070643s" podCreationTimestamp="2026-04-20 19:36:32 +0000 UTC" firstStartedPulling="2026-04-20 19:36:33.546146287 +0000 UTC m=+667.536633803" lastFinishedPulling="2026-04-20 19:36:34.932026249 +0000 UTC m=+668.922513765" observedRunningTime="2026-04-20 19:36:35.044564081 +0000 UTC m=+669.035051626" watchObservedRunningTime="2026-04-20 19:36:35.047070643 +0000 UTC m=+669.037558177" Apr 20 19:36:41.037871 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:41.037839 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-796c56555f-bbvsj" Apr 20 19:36:42.017184 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.017149 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6cf548645d-smmfr" Apr 20 19:36:42.017366 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.017210 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf" Apr 20 19:36:42.076111 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.076073 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-cr7kf"] Apr 20 19:36:42.076526 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.076291 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf" podUID="f54b4d62-fe56-4373-851c-3102b0555745" containerName="manager" containerID="cri-o://8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714" gracePeriod=10 Apr 20 19:36:42.327368 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.327344 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf" Apr 20 19:36:42.362967 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.362931 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-56cb74577d-pp6s4"] Apr 20 19:36:42.363518 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.363492 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f54b4d62-fe56-4373-851c-3102b0555745" containerName="manager" Apr 20 19:36:42.363518 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.363515 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54b4d62-fe56-4373-851c-3102b0555745" containerName="manager" Apr 20 19:36:42.363719 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.363594 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f54b4d62-fe56-4373-851c-3102b0555745" containerName="manager" Apr 20 19:36:42.367063 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.367045 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-56cb74577d-pp6s4" Apr 20 19:36:42.376028 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.376001 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-56cb74577d-pp6s4"] Apr 20 19:36:42.473396 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.473358 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5mrc\" (UniqueName: \"kubernetes.io/projected/f54b4d62-fe56-4373-851c-3102b0555745-kube-api-access-s5mrc\") pod \"f54b4d62-fe56-4373-851c-3102b0555745\" (UID: \"f54b4d62-fe56-4373-851c-3102b0555745\") " Apr 20 19:36:42.473623 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.473610 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86r5j\" (UniqueName: \"kubernetes.io/projected/4122ac80-0aee-4c28-8268-ef519cfb0da8-kube-api-access-86r5j\") pod \"maas-controller-56cb74577d-pp6s4\" (UID: \"4122ac80-0aee-4c28-8268-ef519cfb0da8\") " pod="opendatahub/maas-controller-56cb74577d-pp6s4" Apr 20 19:36:42.475582 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.475554 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54b4d62-fe56-4373-851c-3102b0555745-kube-api-access-s5mrc" (OuterVolumeSpecName: "kube-api-access-s5mrc") pod "f54b4d62-fe56-4373-851c-3102b0555745" (UID: "f54b4d62-fe56-4373-851c-3102b0555745"). InnerVolumeSpecName "kube-api-access-s5mrc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:36:42.574279 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.574246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86r5j\" (UniqueName: \"kubernetes.io/projected/4122ac80-0aee-4c28-8268-ef519cfb0da8-kube-api-access-86r5j\") pod \"maas-controller-56cb74577d-pp6s4\" (UID: \"4122ac80-0aee-4c28-8268-ef519cfb0da8\") " pod="opendatahub/maas-controller-56cb74577d-pp6s4" Apr 20 19:36:42.574603 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.574369 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s5mrc\" (UniqueName: \"kubernetes.io/projected/f54b4d62-fe56-4373-851c-3102b0555745-kube-api-access-s5mrc\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:36:42.583299 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.583277 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86r5j\" (UniqueName: \"kubernetes.io/projected/4122ac80-0aee-4c28-8268-ef519cfb0da8-kube-api-access-86r5j\") pod \"maas-controller-56cb74577d-pp6s4\" (UID: \"4122ac80-0aee-4c28-8268-ef519cfb0da8\") " pod="opendatahub/maas-controller-56cb74577d-pp6s4" Apr 20 19:36:42.679140 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.679101 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-56cb74577d-pp6s4" Apr 20 19:36:42.808356 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:42.808325 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-56cb74577d-pp6s4"] Apr 20 19:36:42.809671 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:36:42.809643 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4122ac80_0aee_4c28_8268_ef519cfb0da8.slice/crio-2b24bf8990f588f16b492fbefa2f65688940150995cc4e0ede848b7a5e6dfd5d WatchSource:0}: Error finding container 2b24bf8990f588f16b492fbefa2f65688940150995cc4e0ede848b7a5e6dfd5d: Status 404 returned error can't find the container with id 2b24bf8990f588f16b492fbefa2f65688940150995cc4e0ede848b7a5e6dfd5d Apr 20 19:36:43.057986 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:43.057893 2572 generic.go:358] "Generic (PLEG): container finished" podID="f54b4d62-fe56-4373-851c-3102b0555745" containerID="8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714" exitCode=0 Apr 20 19:36:43.057986 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:43.057966 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf" event={"ID":"f54b4d62-fe56-4373-851c-3102b0555745","Type":"ContainerDied","Data":"8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714"} Apr 20 19:36:43.057986 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:43.057975 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf" Apr 20 19:36:43.058272 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:43.057999 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-cr7kf" event={"ID":"f54b4d62-fe56-4373-851c-3102b0555745","Type":"ContainerDied","Data":"bd16dd7cb1b497ce04c77e9cffe8b85d28a350082c9853a81b82ca11ce4430ec"} Apr 20 19:36:43.058272 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:43.058021 2572 scope.go:117] "RemoveContainer" containerID="8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714" Apr 20 19:36:43.059243 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:43.059204 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-56cb74577d-pp6s4" event={"ID":"4122ac80-0aee-4c28-8268-ef519cfb0da8","Type":"ContainerStarted","Data":"2b24bf8990f588f16b492fbefa2f65688940150995cc4e0ede848b7a5e6dfd5d"} Apr 20 19:36:43.066262 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:43.066245 2572 scope.go:117] "RemoveContainer" containerID="8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714" Apr 20 19:36:43.066547 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:36:43.066523 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714\": container with ID starting with 8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714 not found: ID does not exist" containerID="8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714" Apr 20 19:36:43.066620 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:43.066554 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714"} err="failed to get container status \"8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714\": rpc error: code = NotFound desc = could not find container \"8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714\": container with ID starting with 8196ce34cc156a7210fdb0c63e0c6036696412db51679810a956a86c869d0714 not found: ID does not exist" Apr 20 19:36:43.076983 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:43.076955 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-cr7kf"] Apr 20 19:36:43.080373 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:43.080352 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-cr7kf"] Apr 20 19:36:44.065118 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:44.065086 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-56cb74577d-pp6s4" event={"ID":"4122ac80-0aee-4c28-8268-ef519cfb0da8","Type":"ContainerStarted","Data":"b31e961b264dd3a9db5a74f2e930dfd6f8feff2591de039ca51f116560303022"} Apr 20 19:36:44.065301 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:44.065181 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-56cb74577d-pp6s4" Apr 20 19:36:44.083369 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:44.083319 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-56cb74577d-pp6s4" podStartSLOduration=1.775880336 podStartE2EDuration="2.083305828s" podCreationTimestamp="2026-04-20 19:36:42 +0000 UTC" firstStartedPulling="2026-04-20 19:36:42.810975674 
+0000 UTC m=+676.801463187" lastFinishedPulling="2026-04-20 19:36:43.118401153 +0000 UTC m=+677.108888679" observedRunningTime="2026-04-20 19:36:44.081529913 +0000 UTC m=+678.072017449" watchObservedRunningTime="2026-04-20 19:36:44.083305828 +0000 UTC m=+678.073793363" Apr 20 19:36:44.603805 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:44.603763 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f54b4d62-fe56-4373-851c-3102b0555745" path="/var/lib/kubelet/pods/f54b4d62-fe56-4373-851c-3102b0555745/volumes" Apr 20 19:36:55.074298 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:55.074258 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-56cb74577d-pp6s4" Apr 20 19:36:55.120295 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:55.120258 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6cf548645d-smmfr"] Apr 20 19:36:55.120547 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:55.120524 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6cf548645d-smmfr" podUID="29925e69-43f1-4b40-9a0e-4b083f5c6887" containerName="manager" containerID="cri-o://8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696" gracePeriod=10 Apr 20 19:36:55.364662 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:55.364637 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6cf548645d-smmfr" Apr 20 19:36:55.494237 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:55.494197 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg272\" (UniqueName: \"kubernetes.io/projected/29925e69-43f1-4b40-9a0e-4b083f5c6887-kube-api-access-cg272\") pod \"29925e69-43f1-4b40-9a0e-4b083f5c6887\" (UID: \"29925e69-43f1-4b40-9a0e-4b083f5c6887\") " Apr 20 19:36:55.496305 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:55.496278 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29925e69-43f1-4b40-9a0e-4b083f5c6887-kube-api-access-cg272" (OuterVolumeSpecName: "kube-api-access-cg272") pod "29925e69-43f1-4b40-9a0e-4b083f5c6887" (UID: "29925e69-43f1-4b40-9a0e-4b083f5c6887"). InnerVolumeSpecName "kube-api-access-cg272". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:36:55.595345 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:55.595308 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cg272\" (UniqueName: \"kubernetes.io/projected/29925e69-43f1-4b40-9a0e-4b083f5c6887-kube-api-access-cg272\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:36:56.110854 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:56.110816 2572 generic.go:358] "Generic (PLEG): container finished" podID="29925e69-43f1-4b40-9a0e-4b083f5c6887" containerID="8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696" exitCode=0 Apr 20 19:36:56.111329 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:56.110877 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6cf548645d-smmfr" Apr 20 19:36:56.111329 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:56.110901 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6cf548645d-smmfr" event={"ID":"29925e69-43f1-4b40-9a0e-4b083f5c6887","Type":"ContainerDied","Data":"8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696"} Apr 20 19:36:56.111329 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:56.110942 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6cf548645d-smmfr" event={"ID":"29925e69-43f1-4b40-9a0e-4b083f5c6887","Type":"ContainerDied","Data":"37ae99329568319f666a4b040bb2615cd3c19f4bf861a4a63459b4f462a3d4b8"} Apr 20 19:36:56.111329 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:56.110963 2572 scope.go:117] "RemoveContainer" containerID="8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696" Apr 20 19:36:56.119966 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:56.119948 2572 scope.go:117] "RemoveContainer" containerID="8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696" Apr 20 19:36:56.120271 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:36:56.120252 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696\": container with ID starting with 8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696 not found: ID does not exist" containerID="8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696" Apr 20 19:36:56.120333 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:56.120284 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696"} err="failed to get container status \"8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696\": rpc error: code = NotFound desc = could not find container \"8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696\": container with ID starting with 8f66f1a7b621340c0e7a240da34ca9f0e285cbef39fb139bfdf034d4916ba696 not found: ID does not exist" Apr 20 19:36:56.133266 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:56.133235 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6cf548645d-smmfr"] Apr 20 19:36:56.139197 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:56.139168 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6cf548645d-smmfr"] Apr 20 19:36:56.231973 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:56.231937 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 19:36:56.232190 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:56.232164 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="keycloak-system/maas-keycloak-0" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak" containerID="cri-o://0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5" gracePeriod=30 Apr 20 19:36:56.604038 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:56.604007 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29925e69-43f1-4b40-9a0e-4b083f5c6887" path="/var/lib/kubelet/pods/29925e69-43f1-4b40-9a0e-4b083f5c6887/volumes" Apr 20 19:36:57.872215 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:57.872189 2572 util.go:48] "No ready sandbox for 
Apr 20 19:36:58.017284 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.017185 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dddd\" (UniqueName: \"kubernetes.io/projected/3f44d534-b149-449a-9860-4202dd9adca8-kube-api-access-6dddd\") pod \"3f44d534-b149-449a-9860-4202dd9adca8\" (UID: \"3f44d534-b149-449a-9860-4202dd9adca8\") "
Apr 20 19:36:58.019479 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.019426 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f44d534-b149-449a-9860-4202dd9adca8-kube-api-access-6dddd" (OuterVolumeSpecName: "kube-api-access-6dddd") pod "3f44d534-b149-449a-9860-4202dd9adca8" (UID: "3f44d534-b149-449a-9860-4202dd9adca8"). InnerVolumeSpecName "kube-api-access-6dddd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:36:58.118101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.118070 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dddd\" (UniqueName: \"kubernetes.io/projected/3f44d534-b149-449a-9860-4202dd9adca8-kube-api-access-6dddd\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\""
Apr 20 19:36:58.120326 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.120301 2572 generic.go:358] "Generic (PLEG): container finished" podID="3f44d534-b149-449a-9860-4202dd9adca8" containerID="0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5" exitCode=143
Apr 20 19:36:58.120467 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.120343 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"3f44d534-b149-449a-9860-4202dd9adca8","Type":"ContainerDied","Data":"0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5"}
Apr 20 19:36:58.120467 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.120363 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:58.120467 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.120375 2572 scope.go:117] "RemoveContainer" containerID="0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5"
Apr 20 19:36:58.120467 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.120366 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"3f44d534-b149-449a-9860-4202dd9adca8","Type":"ContainerDied","Data":"94d05f77cc7b3e6da79372d7b65a4ddd1eae41f25d9a93c3e76a03a161f2da24"}
Apr 20 19:36:58.129593 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.129572 2572 scope.go:117] "RemoveContainer" containerID="0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5"
Apr 20 19:36:58.129862 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:36:58.129843 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5\": container with ID starting with 0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5 not found: ID does not exist" containerID="0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5"
Apr 20 19:36:58.129909 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.129870 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5"} err="failed to get container status \"0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5\": rpc error: code = NotFound desc = could not find container \"0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5\": container with ID starting with 0ae0c2a3a002c2dc94b4d5418e43f135b5ad0467b38c13494b148cfb2e61f7c5 not found: ID does not exist"
Apr 20 19:36:58.141738 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.141716 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 19:36:58.146107 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.146083 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 19:36:58.170348 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.170314 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 19:36:58.170764 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.170750 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29925e69-43f1-4b40-9a0e-4b083f5c6887" containerName="manager"
Apr 20 19:36:58.170806 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.170767 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="29925e69-43f1-4b40-9a0e-4b083f5c6887" containerName="manager"
Apr 20 19:36:58.170806 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.170792 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak"
Apr 20 19:36:58.170806 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.170798 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak"
Apr 20 19:36:58.170903 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.170858 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="29925e69-43f1-4b40-9a0e-4b083f5c6887" containerName="manager"
Apr 20 19:36:58.170903 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.170869 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f44d534-b149-449a-9860-4202dd9adca8" containerName="keycloak"
Apr 20 19:36:58.175403 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.175379 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:58.178026 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.177991 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 20 19:36:58.178163 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.178075 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 20 19:36:58.178231 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.178170 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\""
Apr 20 19:36:58.178372 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.178357 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"keycloak-test-realms\""
Apr 20 19:36:58.178411 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.178363 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-5pbf7\""
Apr 20 19:36:58.183981 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.183953 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 19:36:58.319268 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.319234 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/2b2ae287-9c19-4599-8acf-af5c6e175f6d-test-realms\") pod \"maas-keycloak-0\" (UID: \"2b2ae287-9c19-4599-8acf-af5c6e175f6d\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:58.319430 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.319305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmzlg\" (UniqueName: \"kubernetes.io/projected/2b2ae287-9c19-4599-8acf-af5c6e175f6d-kube-api-access-kmzlg\") pod \"maas-keycloak-0\" (UID: \"2b2ae287-9c19-4599-8acf-af5c6e175f6d\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:58.419915 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.419868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/2b2ae287-9c19-4599-8acf-af5c6e175f6d-test-realms\") pod \"maas-keycloak-0\" (UID: \"2b2ae287-9c19-4599-8acf-af5c6e175f6d\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:58.419915 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.419918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmzlg\" (UniqueName: \"kubernetes.io/projected/2b2ae287-9c19-4599-8acf-af5c6e175f6d-kube-api-access-kmzlg\") pod \"maas-keycloak-0\" (UID: \"2b2ae287-9c19-4599-8acf-af5c6e175f6d\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:58.420569 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.420549 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/2b2ae287-9c19-4599-8acf-af5c6e175f6d-test-realms\") pod \"maas-keycloak-0\" (UID: \"2b2ae287-9c19-4599-8acf-af5c6e175f6d\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:58.428869 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.428846 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmzlg\" (UniqueName: \"kubernetes.io/projected/2b2ae287-9c19-4599-8acf-af5c6e175f6d-kube-api-access-kmzlg\") pod \"maas-keycloak-0\" (UID: \"2b2ae287-9c19-4599-8acf-af5c6e175f6d\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:58.486004 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.485965 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:58.603031 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.602956 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f44d534-b149-449a-9860-4202dd9adca8" path="/var/lib/kubelet/pods/3f44d534-b149-449a-9860-4202dd9adca8/volumes"
Apr 20 19:36:58.612329 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.612122 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 19:36:58.618218 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:58.617673 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 19:36:59.126392 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:59.126355 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"2b2ae287-9c19-4599-8acf-af5c6e175f6d","Type":"ContainerStarted","Data":"e9ce0e4866eb55dba6bdf05e29c3c2e542030bd3cc5d68ab6fcaf8542094cf48"}
Apr 20 19:36:59.126392 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:59.126395 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"2b2ae287-9c19-4599-8acf-af5c6e175f6d","Type":"ContainerStarted","Data":"4f7c4012ffed58a4c80471624a0b4b38da4aaf6809df18da7d0398cfc3ed09fb"}
Apr 20 19:36:59.147213 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:59.147148 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=0.774900665 podStartE2EDuration="1.147129928s" podCreationTimestamp="2026-04-20 19:36:58 +0000 UTC" firstStartedPulling="2026-04-20 19:36:58.617878525 +0000 UTC m=+692.608366038" lastFinishedPulling="2026-04-20 19:36:58.990107788 +0000 UTC m=+692.980595301" observedRunningTime="2026-04-20 19:36:59.145781087 +0000 UTC m=+693.136268623" watchObservedRunningTime="2026-04-20 19:36:59.147129928 +0000 UTC m=+693.137617465"
Apr 20 19:36:59.486558 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:59.486521 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0"
Apr 20 19:36:59.488390 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:36:59.488351 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused"
Apr 20 19:37:00.487459 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:00.487401 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused"
Apr 20 19:37:01.487566 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:01.487504 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused"
prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused" Apr 20 19:37:02.486723 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:02.486674 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused" Apr 20 19:37:03.487797 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:03.487366 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused" Apr 20 19:37:04.151703 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:04.151664 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6996cc8f49-8fj8h"] Apr 20 19:37:04.159183 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:04.159139 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6996cc8f49-8fj8h" Apr 20 19:37:04.168064 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:04.168006 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6996cc8f49-8fj8h"] Apr 20 19:37:04.175120 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:04.174976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8z2j\" (UniqueName: \"kubernetes.io/projected/350c00d6-81b0-47a4-9396-0547f4b26823-kube-api-access-f8z2j\") pod \"maas-api-6996cc8f49-8fj8h\" (UID: \"350c00d6-81b0-47a4-9396-0547f4b26823\") " pod="opendatahub/maas-api-6996cc8f49-8fj8h" Apr 20 19:37:04.175120 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:04.175027 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/350c00d6-81b0-47a4-9396-0547f4b26823-maas-api-tls\") pod \"maas-api-6996cc8f49-8fj8h\" (UID: \"350c00d6-81b0-47a4-9396-0547f4b26823\") " pod="opendatahub/maas-api-6996cc8f49-8fj8h" Apr 20 19:37:04.276533 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:04.275829 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8z2j\" (UniqueName: \"kubernetes.io/projected/350c00d6-81b0-47a4-9396-0547f4b26823-kube-api-access-f8z2j\") pod \"maas-api-6996cc8f49-8fj8h\" (UID: \"350c00d6-81b0-47a4-9396-0547f4b26823\") " pod="opendatahub/maas-api-6996cc8f49-8fj8h" Apr 20 19:37:04.276533 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:04.275888 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/350c00d6-81b0-47a4-9396-0547f4b26823-maas-api-tls\") pod \"maas-api-6996cc8f49-8fj8h\" (UID: \"350c00d6-81b0-47a4-9396-0547f4b26823\") " pod="opendatahub/maas-api-6996cc8f49-8fj8h" Apr 20 19:37:04.279541 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:04.279425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/350c00d6-81b0-47a4-9396-0547f4b26823-maas-api-tls\") pod 
\"maas-api-6996cc8f49-8fj8h\" (UID: \"350c00d6-81b0-47a4-9396-0547f4b26823\") " pod="opendatahub/maas-api-6996cc8f49-8fj8h" Apr 20 19:37:04.288007 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:04.287932 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8z2j\" (UniqueName: \"kubernetes.io/projected/350c00d6-81b0-47a4-9396-0547f4b26823-kube-api-access-f8z2j\") pod \"maas-api-6996cc8f49-8fj8h\" (UID: \"350c00d6-81b0-47a4-9396-0547f4b26823\") " pod="opendatahub/maas-api-6996cc8f49-8fj8h" Apr 20 19:37:04.482020 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:04.481907 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6996cc8f49-8fj8h" Apr 20 19:37:04.487352 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:04.487311 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused" Apr 20 19:37:04.896626 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:04.896374 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6996cc8f49-8fj8h"] Apr 20 19:37:04.904397 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:37:04.904347 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod350c00d6_81b0_47a4_9396_0547f4b26823.slice/crio-f5ad946d05120fb8502e5407c92c29b3d6be752152a8a4754683f3ec6f2006a1 WatchSource:0}: Error finding container f5ad946d05120fb8502e5407c92c29b3d6be752152a8a4754683f3ec6f2006a1: Status 404 returned error can't find the container with id f5ad946d05120fb8502e5407c92c29b3d6be752152a8a4754683f3ec6f2006a1 Apr 20 19:37:05.161087 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:05.160969 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6996cc8f49-8fj8h" event={"ID":"350c00d6-81b0-47a4-9396-0547f4b26823","Type":"ContainerStarted","Data":"f5ad946d05120fb8502e5407c92c29b3d6be752152a8a4754683f3ec6f2006a1"} Apr 20 19:37:05.486423 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:05.486323 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused" Apr 20 19:37:06.486955 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:06.486894 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused" Apr 20 19:37:07.487467 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:07.487399 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused" Apr 20 19:37:08.194354 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:08.194312 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6996cc8f49-8fj8h" 
event={"ID":"350c00d6-81b0-47a4-9396-0547f4b26823","Type":"ContainerStarted","Data":"169708af3d5b70bc10913726448d3e8f27a908c904ee109be479bca14122f47e"} Apr 20 19:37:08.194565 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:08.194471 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6996cc8f49-8fj8h" Apr 20 19:37:08.486403 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:08.486305 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 20 19:37:08.486601 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:08.486479 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused" Apr 20 19:37:09.487296 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:09.487245 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused" Apr 20 19:37:10.486656 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:10.486607 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused" Apr 20 19:37:11.486834 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:11.486792 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused" Apr 20 19:37:12.486845 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:12.486799 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.43:9000/health/started\": dial tcp 10.133.0.43:9000: connect: connection refused" Apr 20 19:37:13.619259 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:13.619209 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 20 19:37:13.638768 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:13.638702 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6996cc8f49-8fj8h" podStartSLOduration=7.18463007 podStartE2EDuration="9.638681279s" podCreationTimestamp="2026-04-20 19:37:04 +0000 UTC" firstStartedPulling="2026-04-20 19:37:04.907271358 +0000 UTC m=+698.897758886" lastFinishedPulling="2026-04-20 19:37:07.361322563 +0000 UTC m=+701.351810095" observedRunningTime="2026-04-20 19:37:08.213301167 +0000 UTC m=+702.203788704" watchObservedRunningTime="2026-04-20 19:37:13.638681279 +0000 UTC m=+707.629168815" Apr 20 19:37:13.640473 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:13.639190 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="2b2ae287-9c19-4599-8acf-af5c6e175f6d" containerName="keycloak" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 19:37:14.206724 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:14.206684 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6996cc8f49-8fj8h" Apr 20 19:37:14.265468 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:14.265073 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-796c56555f-bbvsj"] Apr 20 19:37:14.265468 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:14.265382 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-796c56555f-bbvsj" podUID="4c7766a2-46bc-4c85-bc78-67046b90b14e" containerName="maas-api" containerID="cri-o://59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935" gracePeriod=30 Apr 20 19:37:14.558909 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:14.558880 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-796c56555f-bbvsj" Apr 20 19:37:14.594278 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:14.593564 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/4c7766a2-46bc-4c85-bc78-67046b90b14e-maas-api-tls\") pod \"4c7766a2-46bc-4c85-bc78-67046b90b14e\" (UID: \"4c7766a2-46bc-4c85-bc78-67046b90b14e\") " Apr 20 19:37:14.594278 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:14.593646 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvxs4\" (UniqueName: \"kubernetes.io/projected/4c7766a2-46bc-4c85-bc78-67046b90b14e-kube-api-access-wvxs4\") pod \"4c7766a2-46bc-4c85-bc78-67046b90b14e\" (UID: \"4c7766a2-46bc-4c85-bc78-67046b90b14e\") " Apr 20 19:37:14.603136 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:14.598658 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7766a2-46bc-4c85-bc78-67046b90b14e-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "4c7766a2-46bc-4c85-bc78-67046b90b14e" (UID: "4c7766a2-46bc-4c85-bc78-67046b90b14e"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:37:14.603136 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:14.603012 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c7766a2-46bc-4c85-bc78-67046b90b14e-kube-api-access-wvxs4" (OuterVolumeSpecName: "kube-api-access-wvxs4") pod "4c7766a2-46bc-4c85-bc78-67046b90b14e" (UID: "4c7766a2-46bc-4c85-bc78-67046b90b14e"). InnerVolumeSpecName "kube-api-access-wvxs4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:37:14.696212 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:14.695747 2572 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/4c7766a2-46bc-4c85-bc78-67046b90b14e-maas-api-tls\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:37:14.696212 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:14.695785 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wvxs4\" (UniqueName: \"kubernetes.io/projected/4c7766a2-46bc-4c85-bc78-67046b90b14e-kube-api-access-wvxs4\") on node \"ip-10-0-131-162.ec2.internal\" DevicePath \"\"" Apr 20 19:37:15.227371 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:15.226768 2572 generic.go:358] "Generic (PLEG): container finished" podID="4c7766a2-46bc-4c85-bc78-67046b90b14e" containerID="59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935" exitCode=0 Apr 20 19:37:15.227371 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:15.226847 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-796c56555f-bbvsj" Apr 20 19:37:15.227371 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:15.226855 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-796c56555f-bbvsj" event={"ID":"4c7766a2-46bc-4c85-bc78-67046b90b14e","Type":"ContainerDied","Data":"59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935"} Apr 20 19:37:15.227371 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:15.226901 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-796c56555f-bbvsj" event={"ID":"4c7766a2-46bc-4c85-bc78-67046b90b14e","Type":"ContainerDied","Data":"3ca58666ca254114be270563cbf170bfcd542a0315cca13f85ff2ae8e064d61f"} Apr 20 19:37:15.227371 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:15.226923 2572 scope.go:117] "RemoveContainer" containerID="59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935" Apr 20 19:37:15.239352 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:15.239325 2572 scope.go:117] "RemoveContainer" containerID="59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935" Apr 20 19:37:15.239756 ip-10-0-131-162 kubenswrapper[2572]: E0420 19:37:15.239733 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935\": container with ID starting with 59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935 not found: ID does not exist" containerID="59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935" Apr 20 19:37:15.239844 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:15.239770 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935"} err="failed to get container status \"59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935\": rpc error: code = NotFound desc = could not find container \"59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935\": container with ID starting with 59cb9000113927555f4a7ab07561cd8df2f72bd33d4d2dc45eef22c2bba05935 not found: ID does not exist" Apr 20 19:37:15.247167 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:15.247124 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-796c56555f-bbvsj"] Apr 20 19:37:15.250948 ip-10-0-131-162 
kubenswrapper[2572]: I0420 19:37:15.250919 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-796c56555f-bbvsj"] Apr 20 19:37:16.606157 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:16.606116 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c7766a2-46bc-4c85-bc78-67046b90b14e" path="/var/lib/kubelet/pods/4c7766a2-46bc-4c85-bc78-67046b90b14e/volumes" Apr 20 19:37:23.625928 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:23.625888 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 20 19:37:51.238091 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.238053 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq"] Apr 20 19:37:51.238586 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.238466 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c7766a2-46bc-4c85-bc78-67046b90b14e" containerName="maas-api" Apr 20 19:37:51.238586 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.238476 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c7766a2-46bc-4c85-bc78-67046b90b14e" containerName="maas-api" Apr 20 19:37:51.238586 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.238554 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c7766a2-46bc-4c85-bc78-67046b90b14e" containerName="maas-api" Apr 20 19:37:51.241795 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.241779 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.244657 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.244635 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 19:37:51.245848 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.245832 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 19:37:51.245933 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.245885 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-dfttb\"" Apr 20 19:37:51.245994 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.245935 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 20 19:37:51.252785 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.252764 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq"] Apr 20 19:37:51.357578 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.357535 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c95435c9-e556-46c9-b9e9-554b5b82ea0d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.357778 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.357670 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c95435c9-e556-46c9-b9e9-554b5b82ea0d-model-cache\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.357778 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.357754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c95435c9-e556-46c9-b9e9-554b5b82ea0d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.357898 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.357812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c95435c9-e556-46c9-b9e9-554b5b82ea0d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.357898 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.357858 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jddc\" (UniqueName: \"kubernetes.io/projected/c95435c9-e556-46c9-b9e9-554b5b82ea0d-kube-api-access-8jddc\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.358002 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.357913 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c95435c9-e556-46c9-b9e9-554b5b82ea0d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.458825 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.458767 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jddc\" (UniqueName: \"kubernetes.io/projected/c95435c9-e556-46c9-b9e9-554b5b82ea0d-kube-api-access-8jddc\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.459051 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.458932 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c95435c9-e556-46c9-b9e9-554b5b82ea0d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.459051 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.458984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c95435c9-e556-46c9-b9e9-554b5b82ea0d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 
20 19:37:51.459203 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.459087 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c95435c9-e556-46c9-b9e9-554b5b82ea0d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.459203 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.459151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c95435c9-e556-46c9-b9e9-554b5b82ea0d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.459315 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.459203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c95435c9-e556-46c9-b9e9-554b5b82ea0d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.459491 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.459432 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c95435c9-e556-46c9-b9e9-554b5b82ea0d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.459613 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.459499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c95435c9-e556-46c9-b9e9-554b5b82ea0d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.459613 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.459606 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c95435c9-e556-46c9-b9e9-554b5b82ea0d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.461316 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.461287 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c95435c9-e556-46c9-b9e9-554b5b82ea0d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.461674 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.461657 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c95435c9-e556-46c9-b9e9-554b5b82ea0d-tls-certs\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.467389 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.467357 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jddc\" (UniqueName: \"kubernetes.io/projected/c95435c9-e556-46c9-b9e9-554b5b82ea0d-kube-api-access-8jddc\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq\" (UID: \"c95435c9-e556-46c9-b9e9-554b5b82ea0d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.553370 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.553261 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:37:51.686734 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:51.686704 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq"] Apr 20 19:37:52.374877 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:52.374839 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" event={"ID":"c95435c9-e556-46c9-b9e9-554b5b82ea0d","Type":"ContainerStarted","Data":"5ff3bdb568bcd5be0321cd6b1bf1e78075e19103f2e79f1ae273c5b8b2d2e094"} Apr 20 19:37:57.397218 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:57.397171 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" event={"ID":"c95435c9-e556-46c9-b9e9-554b5b82ea0d","Type":"ContainerStarted","Data":"80dd50f36c47577d54c2df5f959cd518355ca7fa53f4badf705043eec74bb77b"} Apr 20 19:37:58.634488 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.634451 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d"] Apr 20 19:37:58.638569 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.638545 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.641333 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.641307 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 19:37:58.648289 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.648264 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d"] Apr 20 19:37:58.735686 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.735648 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r"] Apr 20 19:37:58.738738 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.738714 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd0732eb-3fc6-4033-bec3-83ece891494d-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.738887 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.738863 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fd0732eb-3fc6-4033-bec3-83ece891494d-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.739033 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.739007 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwgxr\" (UniqueName: \"kubernetes.io/projected/fd0732eb-3fc6-4033-bec3-83ece891494d-kube-api-access-gwgxr\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.739106 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.739062 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fd0732eb-3fc6-4033-bec3-83ece891494d-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.739175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.739151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0732eb-3fc6-4033-bec3-83ece891494d-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.739234 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.739209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd0732eb-3fc6-4033-bec3-83ece891494d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.739712 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.739694 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.742512 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.742494 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 19:37:58.751278 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.751247 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r"] Apr 20 19:37:58.840954 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.840908 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4ktb\" (UniqueName: \"kubernetes.io/projected/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-kube-api-access-k4ktb\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.841109 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.840968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd0732eb-3fc6-4033-bec3-83ece891494d-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.841109 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841027 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.841109 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841095 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fd0732eb-3fc6-4033-bec3-83ece891494d-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.841273 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841163 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwgxr\" (UniqueName: \"kubernetes.io/projected/fd0732eb-3fc6-4033-bec3-83ece891494d-kube-api-access-gwgxr\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.841273 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.841351 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fd0732eb-3fc6-4033-bec3-83ece891494d-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: 
\"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.841351 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841340 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.841482 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841367 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.841482 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841372 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd0732eb-3fc6-4033-bec3-83ece891494d-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.841482 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0732eb-3fc6-4033-bec3-83ece891494d-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.841482 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd0732eb-3fc6-4033-bec3-83ece891494d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.841684 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841493 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fd0732eb-3fc6-4033-bec3-83ece891494d-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.841684 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841514 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.841786 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.841765 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd0732eb-3fc6-4033-bec3-83ece891494d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: 
\"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.843638 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.843604 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fd0732eb-3fc6-4033-bec3-83ece891494d-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.843837 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.843785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0732eb-3fc6-4033-bec3-83ece891494d-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.849703 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.849680 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwgxr\" (UniqueName: \"kubernetes.io/projected/fd0732eb-3fc6-4033-bec3-83ece891494d-kube-api-access-gwgxr\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d\" (UID: \"fd0732eb-3fc6-4033-bec3-83ece891494d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.942543 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.942413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.942692 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.942570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.942692 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.942633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.942692 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.942659 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.942847 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.942718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.942847 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.942796 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4ktb\" (UniqueName: \"kubernetes.io/projected/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-kube-api-access-k4ktb\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.943043 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.943009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.943043 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.943025 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.943197 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.943119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.945050 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.945024 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.945226 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.945207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:58.950546 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.950519 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:37:58.950870 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:58.950845 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4ktb\" (UniqueName: \"kubernetes.io/projected/4551c113-e2f6-4983-ac8c-dd61dc9d15c6-kube-api-access-k4ktb\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r\" (UID: \"4551c113-e2f6-4983-ac8c-dd61dc9d15c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:59.053081 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:59.052572 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:37:59.146651 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:59.146623 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d"] Apr 20 19:37:59.148311 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:37:59.148283 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd0732eb_3fc6_4033_bec3_83ece891494d.slice/crio-e34add49cb4d96b47ab69eab179004c4911b40933d0031e52ab98dbf7509f59e WatchSource:0}: Error finding container e34add49cb4d96b47ab69eab179004c4911b40933d0031e52ab98dbf7509f59e: Status 404 returned error can't find the container with id e34add49cb4d96b47ab69eab179004c4911b40933d0031e52ab98dbf7509f59e Apr 20 19:37:59.212221 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:59.212194 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r"] Apr 20 19:37:59.214796 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:37:59.214754 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4551c113_e2f6_4983_ac8c_dd61dc9d15c6.slice/crio-bd5201a570ac827b42321be48d33a687236efb022a2615117ffd3623d18d4699 WatchSource:0}: Error finding container bd5201a570ac827b42321be48d33a687236efb022a2615117ffd3623d18d4699: Status 404 returned error can't find the container with id bd5201a570ac827b42321be48d33a687236efb022a2615117ffd3623d18d4699 Apr 20 19:37:59.408016 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:59.407968 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" event={"ID":"4551c113-e2f6-4983-ac8c-dd61dc9d15c6","Type":"ContainerStarted","Data":"7c757553bfa795967cb001cbfe3cacc80d01b002626faabca08689bac9b3d1ff"} Apr 20 19:37:59.408016 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:59.408016 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" event={"ID":"4551c113-e2f6-4983-ac8c-dd61dc9d15c6","Type":"ContainerStarted","Data":"bd5201a570ac827b42321be48d33a687236efb022a2615117ffd3623d18d4699"} Apr 20 19:37:59.409640 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:59.409611 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" event={"ID":"fd0732eb-3fc6-4033-bec3-83ece891494d","Type":"ContainerStarted","Data":"6d5581d4b7a8fe83e82c3dc9698dadac1fd471dd28c8e16fa20b37c8b359392d"} Apr 20 19:37:59.409787 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:37:59.409648 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" event={"ID":"fd0732eb-3fc6-4033-bec3-83ece891494d","Type":"ContainerStarted","Data":"e34add49cb4d96b47ab69eab179004c4911b40933d0031e52ab98dbf7509f59e"} Apr 20 19:38:03.429217 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:03.429149 2572 generic.go:358] "Generic (PLEG): container finished" podID="c95435c9-e556-46c9-b9e9-554b5b82ea0d" containerID="80dd50f36c47577d54c2df5f959cd518355ca7fa53f4badf705043eec74bb77b" exitCode=0 Apr 20 19:38:03.429711 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:03.429224 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" 
event={"ID":"c95435c9-e556-46c9-b9e9-554b5b82ea0d","Type":"ContainerDied","Data":"80dd50f36c47577d54c2df5f959cd518355ca7fa53f4badf705043eec74bb77b"} Apr 20 19:38:05.439312 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:05.439204 2572 generic.go:358] "Generic (PLEG): container finished" podID="4551c113-e2f6-4983-ac8c-dd61dc9d15c6" containerID="7c757553bfa795967cb001cbfe3cacc80d01b002626faabca08689bac9b3d1ff" exitCode=0 Apr 20 19:38:05.439312 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:05.439284 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" event={"ID":"4551c113-e2f6-4983-ac8c-dd61dc9d15c6","Type":"ContainerDied","Data":"7c757553bfa795967cb001cbfe3cacc80d01b002626faabca08689bac9b3d1ff"} Apr 20 19:38:05.441079 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:05.441060 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" event={"ID":"c95435c9-e556-46c9-b9e9-554b5b82ea0d","Type":"ContainerStarted","Data":"4b25aab037b53258a1632f6586ea261df7457d5b5d4dc463ac00a5887d0b66e1"} Apr 20 19:38:05.441275 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:05.441257 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:38:05.442476 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:05.442430 2572 generic.go:358] "Generic (PLEG): container finished" podID="fd0732eb-3fc6-4033-bec3-83ece891494d" containerID="6d5581d4b7a8fe83e82c3dc9698dadac1fd471dd28c8e16fa20b37c8b359392d" exitCode=0 Apr 20 19:38:05.442557 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:05.442502 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" event={"ID":"fd0732eb-3fc6-4033-bec3-83ece891494d","Type":"ContainerDied","Data":"6d5581d4b7a8fe83e82c3dc9698dadac1fd471dd28c8e16fa20b37c8b359392d"} Apr 20 19:38:05.516971 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:05.516917 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" podStartSLOduration=1.6362656709999999 podStartE2EDuration="14.51690145s" podCreationTimestamp="2026-04-20 19:37:51 +0000 UTC" firstStartedPulling="2026-04-20 19:37:51.694941266 +0000 UTC m=+745.685428779" lastFinishedPulling="2026-04-20 19:38:04.57557703 +0000 UTC m=+758.566064558" observedRunningTime="2026-04-20 19:38:05.515241045 +0000 UTC m=+759.505728581" watchObservedRunningTime="2026-04-20 19:38:05.51690145 +0000 UTC m=+759.507388985" Apr 20 19:38:06.447826 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:06.447790 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" event={"ID":"4551c113-e2f6-4983-ac8c-dd61dc9d15c6","Type":"ContainerStarted","Data":"d47e270cc527c4ff681d15d6500e906601d84fd52e74b74babce53be5934370a"} Apr 20 19:38:06.448280 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:06.448038 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:38:06.449375 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:06.449351 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" 
event={"ID":"fd0732eb-3fc6-4033-bec3-83ece891494d","Type":"ContainerStarted","Data":"018ffa595a4cc84d0ca312554da2bec7ec5b2593aaa0ac8b7beeca06c5f64641"} Apr 20 19:38:06.449710 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:06.449691 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:38:06.471505 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:06.471434 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" podStartSLOduration=8.260593914 podStartE2EDuration="8.471419916s" podCreationTimestamp="2026-04-20 19:37:58 +0000 UTC" firstStartedPulling="2026-04-20 19:38:05.440102844 +0000 UTC m=+759.430590358" lastFinishedPulling="2026-04-20 19:38:05.650928844 +0000 UTC m=+759.641416360" observedRunningTime="2026-04-20 19:38:06.46994828 +0000 UTC m=+760.460435814" watchObservedRunningTime="2026-04-20 19:38:06.471419916 +0000 UTC m=+760.461907451" Apr 20 19:38:06.491122 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:06.491064 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" podStartSLOduration=8.216117744 podStartE2EDuration="8.491046379s" podCreationTimestamp="2026-04-20 19:37:58 +0000 UTC" firstStartedPulling="2026-04-20 19:38:05.44308349 +0000 UTC m=+759.433571004" lastFinishedPulling="2026-04-20 19:38:05.718012126 +0000 UTC m=+759.708499639" observedRunningTime="2026-04-20 19:38:06.488814437 +0000 UTC m=+760.479301973" watchObservedRunningTime="2026-04-20 19:38:06.491046379 +0000 UTC m=+760.481533913" Apr 20 19:38:09.633342 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.633304 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x"] Apr 20 19:38:09.637940 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.637918 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.640537 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.640519 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 20 19:38:09.649101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.649068 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x"] Apr 20 19:38:09.757732 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.757698 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/68fbf495-013e-40fd-b1f8-2c4205a444d0-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.757902 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.757744 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mt6s\" (UniqueName: \"kubernetes.io/projected/68fbf495-013e-40fd-b1f8-2c4205a444d0-kube-api-access-2mt6s\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.757902 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.757769 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/68fbf495-013e-40fd-b1f8-2c4205a444d0-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.757902 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.757895 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68fbf495-013e-40fd-b1f8-2c4205a444d0-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.758025 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.757928 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/68fbf495-013e-40fd-b1f8-2c4205a444d0-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.758059 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.758025 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/68fbf495-013e-40fd-b1f8-2c4205a444d0-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.858963 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.858913 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68fbf495-013e-40fd-b1f8-2c4205a444d0-kserve-provision-location\") pod 
\"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.859175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.858973 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/68fbf495-013e-40fd-b1f8-2c4205a444d0-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.859175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.859023 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/68fbf495-013e-40fd-b1f8-2c4205a444d0-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.859175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.859057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/68fbf495-013e-40fd-b1f8-2c4205a444d0-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.859175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.859098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mt6s\" (UniqueName: \"kubernetes.io/projected/68fbf495-013e-40fd-b1f8-2c4205a444d0-kube-api-access-2mt6s\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.859175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.859127 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/68fbf495-013e-40fd-b1f8-2c4205a444d0-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.859489 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.859401 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68fbf495-013e-40fd-b1f8-2c4205a444d0-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.859564 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.859517 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/68fbf495-013e-40fd-b1f8-2c4205a444d0-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.859721 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.859702 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/68fbf495-013e-40fd-b1f8-2c4205a444d0-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: 
\"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.861460 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.861410 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/68fbf495-013e-40fd-b1f8-2c4205a444d0-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.862101 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.862078 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/68fbf495-013e-40fd-b1f8-2c4205a444d0-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.868827 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.868800 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mt6s\" (UniqueName: \"kubernetes.io/projected/68fbf495-013e-40fd-b1f8-2c4205a444d0-kube-api-access-2mt6s\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x\" (UID: \"68fbf495-013e-40fd-b1f8-2c4205a444d0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:09.949054 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:09.948953 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:10.084702 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:10.084675 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x"] Apr 20 19:38:10.086096 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:38:10.086073 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68fbf495_013e_40fd_b1f8_2c4205a444d0.slice/crio-8626659b1e459b62ed6828fc7e99a9205b4febe22cef3bb939754de34e72b5ee WatchSource:0}: Error finding container 8626659b1e459b62ed6828fc7e99a9205b4febe22cef3bb939754de34e72b5ee: Status 404 returned error can't find the container with id 8626659b1e459b62ed6828fc7e99a9205b4febe22cef3bb939754de34e72b5ee Apr 20 19:38:10.467069 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:10.467027 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" event={"ID":"68fbf495-013e-40fd-b1f8-2c4205a444d0","Type":"ContainerStarted","Data":"26a6e1790866c56db36cde9cbe39672a274e9c84a5c42c6a1a7b165424ebd075"} Apr 20 19:38:10.467069 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:10.467070 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" event={"ID":"68fbf495-013e-40fd-b1f8-2c4205a444d0","Type":"ContainerStarted","Data":"8626659b1e459b62ed6828fc7e99a9205b4febe22cef3bb939754de34e72b5ee"} Apr 20 19:38:16.462815 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:16.462780 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq" Apr 20 19:38:16.495791 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:16.495759 2572 generic.go:358] "Generic (PLEG): container finished" podID="68fbf495-013e-40fd-b1f8-2c4205a444d0" containerID="26a6e1790866c56db36cde9cbe39672a274e9c84a5c42c6a1a7b165424ebd075" 
exitCode=0 Apr 20 19:38:16.495952 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:16.495834 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" event={"ID":"68fbf495-013e-40fd-b1f8-2c4205a444d0","Type":"ContainerDied","Data":"26a6e1790866c56db36cde9cbe39672a274e9c84a5c42c6a1a7b165424ebd075"} Apr 20 19:38:17.466418 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:17.466380 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r" Apr 20 19:38:17.470128 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:17.470100 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d" Apr 20 19:38:17.502455 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:17.502398 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" event={"ID":"68fbf495-013e-40fd-b1f8-2c4205a444d0","Type":"ContainerStarted","Data":"a38f433be65698b033515177277002872b2e13e340491016b89ac1d01b76ddb8"} Apr 20 19:38:17.502748 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:17.502714 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 19:38:17.526915 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:17.526866 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" podStartSLOduration=8.338671922 podStartE2EDuration="8.526847373s" podCreationTimestamp="2026-04-20 19:38:09 +0000 UTC" firstStartedPulling="2026-04-20 19:38:16.496562771 +0000 UTC m=+770.487050284" lastFinishedPulling="2026-04-20 19:38:16.684738221 +0000 UTC m=+770.675225735" observedRunningTime="2026-04-20 19:38:17.525792932 +0000 UTC m=+771.516280468" watchObservedRunningTime="2026-04-20 19:38:17.526847373 +0000 UTC m=+771.517334909" Apr 20 19:38:21.439322 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.439285 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z"] Apr 20 19:38:21.463898 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.463859 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z"] Apr 20 19:38:21.464052 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.463978 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.466599 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.466570 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 19:38:21.579196 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.579152 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr29s\" (UniqueName: \"kubernetes.io/projected/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-kube-api-access-kr29s\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.579196 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.579201 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.579475 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.579272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.579475 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.579374 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.579475 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.579468 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.579614 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.579497 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.680825 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.680782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-kserve-provision-location\") pod 
\"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.681017 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.680891 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.681017 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.680918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.681017 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.680940 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kr29s\" (UniqueName: \"kubernetes.io/projected/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-kube-api-access-kr29s\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.681017 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.680972 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.681250 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.681020 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.681354 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.681318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.681760 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.681740 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.681916 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.681886 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.684130 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.684108 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.684240 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.684187 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.690051 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.689989 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr29s\" (UniqueName: \"kubernetes.io/projected/427c1dd4-e3a5-4569-86a5-ca8cdfedacb0-kube-api-access-kr29s\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j958z\" (UID: \"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.774119 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.774086 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:21.909585 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:21.909556 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z"] Apr 20 19:38:21.910943 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:38:21.910921 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod427c1dd4_e3a5_4569_86a5_ca8cdfedacb0.slice/crio-4270488c4bb5b966dc38504ae87087ccd7c18ad64d6a703b4ab8bce38c72fe52 WatchSource:0}: Error finding container 4270488c4bb5b966dc38504ae87087ccd7c18ad64d6a703b4ab8bce38c72fe52: Status 404 returned error can't find the container with id 4270488c4bb5b966dc38504ae87087ccd7c18ad64d6a703b4ab8bce38c72fe52 Apr 20 19:38:22.524912 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:22.524868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" event={"ID":"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0","Type":"ContainerStarted","Data":"fd4f843c5cdd5611b51902f08a3b25e3f3e9cd1b6f343ec8eec205386b4050fb"} Apr 20 19:38:22.524912 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:22.524905 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" event={"ID":"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0","Type":"ContainerStarted","Data":"4270488c4bb5b966dc38504ae87087ccd7c18ad64d6a703b4ab8bce38c72fe52"} Apr 20 19:38:28.525505 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:28.525473 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x" Apr 20 
19:38:30.560840 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:30.560798 2572 generic.go:358] "Generic (PLEG): container finished" podID="427c1dd4-e3a5-4569-86a5-ca8cdfedacb0" containerID="fd4f843c5cdd5611b51902f08a3b25e3f3e9cd1b6f343ec8eec205386b4050fb" exitCode=0 Apr 20 19:38:30.561283 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:30.560870 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" event={"ID":"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0","Type":"ContainerDied","Data":"fd4f843c5cdd5611b51902f08a3b25e3f3e9cd1b6f343ec8eec205386b4050fb"} Apr 20 19:38:31.566721 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:31.566686 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" event={"ID":"427c1dd4-e3a5-4569-86a5-ca8cdfedacb0","Type":"ContainerStarted","Data":"7ffd80fb3bd74f9eee45fbe0312914449122c1fccd755fb3edbbed19175f5a36"} Apr 20 19:38:31.567160 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:31.566884 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:31.587540 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:31.587490 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" podStartSLOduration=10.373235927 podStartE2EDuration="10.587474485s" podCreationTimestamp="2026-04-20 19:38:21 +0000 UTC" firstStartedPulling="2026-04-20 19:38:30.561579558 +0000 UTC m=+784.552067071" lastFinishedPulling="2026-04-20 19:38:30.775818113 +0000 UTC m=+784.766305629" observedRunningTime="2026-04-20 19:38:31.58626722 +0000 UTC m=+785.576754754" watchObservedRunningTime="2026-04-20 19:38:31.587474485 +0000 UTC m=+785.577962020" Apr 20 19:38:38.058640 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.058604 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl"] Apr 20 19:38:38.097511 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.097475 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl"] Apr 20 19:38:38.097681 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.097563 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.100642 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.100622 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 20 19:38:38.248619 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.248575 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/243812ff-f0d4-4001-87f5-f4d392e43d50-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.248827 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.248641 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knt5x\" (UniqueName: \"kubernetes.io/projected/243812ff-f0d4-4001-87f5-f4d392e43d50-kube-api-access-knt5x\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.248827 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.248767 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/243812ff-f0d4-4001-87f5-f4d392e43d50-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.248827 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.248811 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/243812ff-f0d4-4001-87f5-f4d392e43d50-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.249007 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.248852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/243812ff-f0d4-4001-87f5-f4d392e43d50-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.249007 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.248886 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/243812ff-f0d4-4001-87f5-f4d392e43d50-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.349530 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.349490 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/243812ff-f0d4-4001-87f5-f4d392e43d50-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.349530 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.349546 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/243812ff-f0d4-4001-87f5-f4d392e43d50-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.349759 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.349585 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/243812ff-f0d4-4001-87f5-f4d392e43d50-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.349759 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.349641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/243812ff-f0d4-4001-87f5-f4d392e43d50-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.349759 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.349668 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knt5x\" (UniqueName: \"kubernetes.io/projected/243812ff-f0d4-4001-87f5-f4d392e43d50-kube-api-access-knt5x\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.349759 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.349742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/243812ff-f0d4-4001-87f5-f4d392e43d50-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.349972 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.349947 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/243812ff-f0d4-4001-87f5-f4d392e43d50-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.350036 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.349968 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/243812ff-f0d4-4001-87f5-f4d392e43d50-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.350036 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.350000 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/243812ff-f0d4-4001-87f5-f4d392e43d50-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.351997 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.351976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/243812ff-f0d4-4001-87f5-f4d392e43d50-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.352155 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.352138 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/243812ff-f0d4-4001-87f5-f4d392e43d50-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.358604 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.358583 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knt5x\" (UniqueName: \"kubernetes.io/projected/243812ff-f0d4-4001-87f5-f4d392e43d50-kube-api-access-knt5x\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-49fwl\" (UID: \"243812ff-f0d4-4001-87f5-f4d392e43d50\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.408022 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.407991 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:38.541686 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.541663 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl"] Apr 20 19:38:38.543828 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:38:38.543802 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod243812ff_f0d4_4001_87f5_f4d392e43d50.slice/crio-734bd397af5be98b5b65055e83d56a56fdbddb6f654532d644d5928ac4bf5a9f WatchSource:0}: Error finding container 734bd397af5be98b5b65055e83d56a56fdbddb6f654532d644d5928ac4bf5a9f: Status 404 returned error can't find the container with id 734bd397af5be98b5b65055e83d56a56fdbddb6f654532d644d5928ac4bf5a9f Apr 20 19:38:38.594264 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:38.594235 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" event={"ID":"243812ff-f0d4-4001-87f5-f4d392e43d50","Type":"ContainerStarted","Data":"734bd397af5be98b5b65055e83d56a56fdbddb6f654532d644d5928ac4bf5a9f"} Apr 20 19:38:39.600234 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:39.600195 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" event={"ID":"243812ff-f0d4-4001-87f5-f4d392e43d50","Type":"ContainerStarted","Data":"90f43ebc2fda819c7e15d1c643b4da1db85b33b23315fd7341385b422338ccb3"} Apr 20 19:38:42.584752 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:42.584721 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j958z" Apr 20 19:38:44.621210 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:44.621172 2572 generic.go:358] "Generic (PLEG): container finished" podID="243812ff-f0d4-4001-87f5-f4d392e43d50" containerID="90f43ebc2fda819c7e15d1c643b4da1db85b33b23315fd7341385b422338ccb3" exitCode=0 Apr 20 19:38:44.621210 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:44.621214 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" 
event={"ID":"243812ff-f0d4-4001-87f5-f4d392e43d50","Type":"ContainerDied","Data":"90f43ebc2fda819c7e15d1c643b4da1db85b33b23315fd7341385b422338ccb3"} Apr 20 19:38:45.626417 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:45.626381 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" event={"ID":"243812ff-f0d4-4001-87f5-f4d392e43d50","Type":"ContainerStarted","Data":"b1a269f9120b7b3b3769bd69fd9cdb19a5ba23329831d4642b7011f3eba1cf80"} Apr 20 19:38:45.626831 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:45.626606 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:38:45.648547 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:45.648499 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" podStartSLOduration=7.204917751 podStartE2EDuration="7.648484997s" podCreationTimestamp="2026-04-20 19:38:38 +0000 UTC" firstStartedPulling="2026-04-20 19:38:44.622037982 +0000 UTC m=+798.612525495" lastFinishedPulling="2026-04-20 19:38:45.065605229 +0000 UTC m=+799.056092741" observedRunningTime="2026-04-20 19:38:45.646616127 +0000 UTC m=+799.637103661" watchObservedRunningTime="2026-04-20 19:38:45.648484997 +0000 UTC m=+799.638972531" Apr 20 19:38:56.644858 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:38:56.644822 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-49fwl" Apr 20 19:39:53.025978 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:53.025950 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6996cc8f49-8fj8h_350c00d6-81b0-47a4-9396-0547f4b26823/maas-api/0.log" Apr 20 19:39:53.137062 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:53.137023 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-56cb74577d-pp6s4_4122ac80-0aee-4c28-8268-ef519cfb0da8/manager/0.log" Apr 20 19:39:53.366057 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:53.366024 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9f747d685-2n2zg_3507a6b2-e19b-491a-bd27-a1bb5136cda2/manager/0.log" Apr 20 19:39:55.073677 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:55.073644 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-sfmzv_bc91b37b-b181-4658-9961-bca7209a70b9/manager/0.log" Apr 20 19:39:55.178607 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:55.178574 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-wbk2k_5913892d-89ac-4fc0-8208-805a302a71f1/manager/0.log" Apr 20 19:39:55.400125 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:55.400042 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-pv7v7_9b137590-67f2-410c-a41d-ef930a9357e1/registry-server/0.log" Apr 20 19:39:55.748126 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:55.748052 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-fmx42_af242632-188d-41a5-b48b-fb8d9ea4acc1/manager/0.log" Apr 20 19:39:56.075667 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:56.075636 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f9tbr7_5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8/istio-proxy/0.log" Apr 20 19:39:56.507109 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:56.507026 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-jkf4g_5888b8ba-846a-49d6-a0b9-b409361b907e/istio-proxy/0.log" Apr 20 19:39:56.937507 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:56.937471 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x_68fbf495-013e-40fd-b1f8-2c4205a444d0/storage-initializer/0.log" Apr 20 19:39:56.944302 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:56.944281 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-d7f2x_68fbf495-013e-40fd-b1f8-2c4205a444d0/main/0.log" Apr 20 19:39:57.055153 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:57.055119 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d_fd0732eb-3fc6-4033-bec3-83ece891494d/storage-initializer/0.log" Apr 20 19:39:57.062891 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:57.062865 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-nhr6d_fd0732eb-3fc6-4033-bec3-83ece891494d/main/0.log" Apr 20 19:39:57.171561 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:57.171537 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-49fwl_243812ff-f0d4-4001-87f5-f4d392e43d50/storage-initializer/0.log" Apr 20 19:39:57.179117 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:57.179091 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-49fwl_243812ff-f0d4-4001-87f5-f4d392e43d50/main/0.log" Apr 20 19:39:57.282462 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:57.282372 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq_c95435c9-e556-46c9-b9e9-554b5b82ea0d/storage-initializer/0.log" Apr 20 19:39:57.289670 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:57.289650 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcch8wdq_c95435c9-e556-46c9-b9e9-554b5b82ea0d/main/0.log" Apr 20 19:39:57.395745 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:57.395719 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r_4551c113-e2f6-4983-ac8c-dd61dc9d15c6/main/0.log" Apr 20 19:39:57.402702 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:57.402677 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-hlt8r_4551c113-e2f6-4983-ac8c-dd61dc9d15c6/storage-initializer/0.log" Apr 20 19:39:57.508275 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:57.508250 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-j958z_427c1dd4-e3a5-4569-86a5-ca8cdfedacb0/main/0.log" Apr 20 19:39:57.515536 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:39:57.515513 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-j958z_427c1dd4-e3a5-4569-86a5-ca8cdfedacb0/storage-initializer/0.log" Apr 
20 19:40:04.435532 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:04.435503 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fhj74_90320faf-0727-4631-bdba-64de071c97ba/global-pull-secret-syncer/0.log" Apr 20 19:40:04.519310 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:04.519284 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q97q5_ca9e64aa-e049-4e11-b4ec-79ec745fa7c6/konnectivity-agent/0.log" Apr 20 19:40:04.577695 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:04.577665 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-162.ec2.internal_aca2508fa5a89bde3f166bc71272b03f/haproxy/0.log" Apr 20 19:40:09.409421 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:09.409390 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-sfmzv_bc91b37b-b181-4658-9961-bca7209a70b9/manager/0.log" Apr 20 19:40:09.436953 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:09.436925 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-wbk2k_5913892d-89ac-4fc0-8208-805a302a71f1/manager/0.log" Apr 20 19:40:09.514768 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:09.514736 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-pv7v7_9b137590-67f2-410c-a41d-ef930a9357e1/registry-server/0.log" Apr 20 19:40:09.620573 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:09.620539 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-fmx42_af242632-188d-41a5-b48b-fb8d9ea4acc1/manager/0.log" Apr 20 19:40:11.411113 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:11.411080 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-8796n_b6b24d33-cf74-4a2b-8f57-d0f3ecadabd0/cluster-monitoring-operator/0.log" Apr 20 19:40:11.449547 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:11.449519 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rq7fq_b7295835-02aa-4369-a6a8-e0d1bab163a9/kube-state-metrics/0.log" Apr 20 19:40:11.473920 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:11.473895 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rq7fq_b7295835-02aa-4369-a6a8-e0d1bab163a9/kube-rbac-proxy-main/0.log" Apr 20 19:40:11.499008 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:11.498938 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rq7fq_b7295835-02aa-4369-a6a8-e0d1bab163a9/kube-rbac-proxy-self/0.log" Apr 20 19:40:11.772537 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:11.772465 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sdmgq_71f4e361-c121-46fd-bedb-ab6e0d2489a4/node-exporter/0.log" Apr 20 19:40:11.799066 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:11.799040 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sdmgq_71f4e361-c121-46fd-bedb-ab6e0d2489a4/kube-rbac-proxy/0.log" Apr 20 19:40:11.823141 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:11.823118 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-sdmgq_71f4e361-c121-46fd-bedb-ab6e0d2489a4/init-textfile/0.log" Apr 20 19:40:11.953673 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:11.953643 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6e5df1e4-62f1-4da8-ab77-58a1cddc3055/prometheus/0.log" Apr 20 19:40:11.979211 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:11.979186 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6e5df1e4-62f1-4da8-ab77-58a1cddc3055/config-reloader/0.log" Apr 20 19:40:12.004037 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.004010 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6e5df1e4-62f1-4da8-ab77-58a1cddc3055/thanos-sidecar/0.log" Apr 20 19:40:12.031274 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.031191 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6e5df1e4-62f1-4da8-ab77-58a1cddc3055/kube-rbac-proxy-web/0.log" Apr 20 19:40:12.074608 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.074583 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6e5df1e4-62f1-4da8-ab77-58a1cddc3055/kube-rbac-proxy/0.log" Apr 20 19:40:12.098990 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.098966 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6e5df1e4-62f1-4da8-ab77-58a1cddc3055/kube-rbac-proxy-thanos/0.log" Apr 20 19:40:12.126288 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.126258 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6e5df1e4-62f1-4da8-ab77-58a1cddc3055/init-config-reloader/0.log" Apr 20 19:40:12.213628 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.213600 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-bldkt_ea330750-040a-4755-8c76-18743a732d31/prometheus-operator-admission-webhook/0.log" Apr 20 19:40:12.251219 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.251188 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5cf88d5bcf-5p6r6_72ead48b-c34a-4cb6-80ad-02aa7e0bf463/telemeter-client/0.log" Apr 20 19:40:12.274743 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.274717 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5cf88d5bcf-5p6r6_72ead48b-c34a-4cb6-80ad-02aa7e0bf463/reload/0.log" Apr 20 19:40:12.297459 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.297383 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5cf88d5bcf-5p6r6_72ead48b-c34a-4cb6-80ad-02aa7e0bf463/kube-rbac-proxy/0.log" Apr 20 19:40:12.330637 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.330612 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-98cd4df66-gw2jt_a558665e-4811-4d1f-b02d-c3e1dc6a92c5/thanos-query/0.log" Apr 20 19:40:12.354365 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.354337 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-98cd4df66-gw2jt_a558665e-4811-4d1f-b02d-c3e1dc6a92c5/kube-rbac-proxy-web/0.log" Apr 20 19:40:12.378044 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.378020 2572 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_thanos-querier-98cd4df66-gw2jt_a558665e-4811-4d1f-b02d-c3e1dc6a92c5/kube-rbac-proxy/0.log" Apr 20 19:40:12.401908 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.401885 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-98cd4df66-gw2jt_a558665e-4811-4d1f-b02d-c3e1dc6a92c5/prom-label-proxy/0.log" Apr 20 19:40:12.425166 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.425142 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-98cd4df66-gw2jt_a558665e-4811-4d1f-b02d-c3e1dc6a92c5/kube-rbac-proxy-rules/0.log" Apr 20 19:40:12.448900 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:12.448873 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-98cd4df66-gw2jt_a558665e-4811-4d1f-b02d-c3e1dc6a92c5/kube-rbac-proxy-metrics/0.log" Apr 20 19:40:13.270837 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.270801 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj"] Apr 20 19:40:13.277042 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.277014 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.277937 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.277915 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj"] Apr 20 19:40:13.279977 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.279955 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qqfns\"/\"kube-root-ca.crt\"" Apr 20 19:40:13.280090 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.280008 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qqfns\"/\"openshift-service-ca.crt\"" Apr 20 19:40:13.281078 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.281063 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qqfns\"/\"default-dockercfg-gfb8d\"" Apr 20 19:40:13.416343 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.416303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3c77de0a-afe6-477e-9ddc-3a115c09a920-podres\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.416562 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.416350 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fgt4\" (UniqueName: \"kubernetes.io/projected/3c77de0a-afe6-477e-9ddc-3a115c09a920-kube-api-access-7fgt4\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.416562 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.416421 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3c77de0a-afe6-477e-9ddc-3a115c09a920-proc\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " 
pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.416660 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.416561 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3c77de0a-afe6-477e-9ddc-3a115c09a920-sys\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.416660 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.416620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3c77de0a-afe6-477e-9ddc-3a115c09a920-lib-modules\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.517361 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.517310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3c77de0a-afe6-477e-9ddc-3a115c09a920-proc\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.517808 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.517418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3c77de0a-afe6-477e-9ddc-3a115c09a920-sys\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.517808 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.517486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3c77de0a-afe6-477e-9ddc-3a115c09a920-proc\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.517808 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.517495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3c77de0a-afe6-477e-9ddc-3a115c09a920-lib-modules\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.517808 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.517537 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3c77de0a-afe6-477e-9ddc-3a115c09a920-sys\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.517808 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.517598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3c77de0a-afe6-477e-9ddc-3a115c09a920-lib-modules\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.517808 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.517604 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3c77de0a-afe6-477e-9ddc-3a115c09a920-podres\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.517808 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.517634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fgt4\" (UniqueName: \"kubernetes.io/projected/3c77de0a-afe6-477e-9ddc-3a115c09a920-kube-api-access-7fgt4\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.517808 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.517735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3c77de0a-afe6-477e-9ddc-3a115c09a920-podres\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.526923 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.526853 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fgt4\" (UniqueName: \"kubernetes.io/projected/3c77de0a-afe6-477e-9ddc-3a115c09a920-kube-api-access-7fgt4\") pod \"perf-node-gather-daemonset-8bwcj\" (UID: \"3c77de0a-afe6-477e-9ddc-3a115c09a920\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.609680 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.609645 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:13.949910 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.949873 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj"] Apr 20 19:40:13.953417 ip-10-0-131-162 kubenswrapper[2572]: W0420 19:40:13.953392 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3c77de0a_afe6_477e_9ddc_3a115c09a920.slice/crio-0aa2197b8d1374554799b8a77495549dc57f97ea191a3f72f8be24caf130d3f1 WatchSource:0}: Error finding container 0aa2197b8d1374554799b8a77495549dc57f97ea191a3f72f8be24caf130d3f1: Status 404 returned error can't find the container with id 0aa2197b8d1374554799b8a77495549dc57f97ea191a3f72f8be24caf130d3f1 Apr 20 19:40:13.983899 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:13.983869 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" event={"ID":"3c77de0a-afe6-477e-9ddc-3a115c09a920","Type":"ContainerStarted","Data":"0aa2197b8d1374554799b8a77495549dc57f97ea191a3f72f8be24caf130d3f1"} Apr 20 19:40:14.146089 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:14.146058 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/1.log" Apr 20 19:40:14.150191 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:14.150164 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/2.log" Apr 20 19:40:14.649276 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:14.649246 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-console_console-65df76fddb-5pzvr_1762578a-ac16-475b-8c21-b6d0085b8549/console/0.log" Apr 20 19:40:14.679928 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:14.679899 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-7xljr_ecc56b03-14e4-4238-8c9a-7974d6774b23/download-server/0.log" Apr 20 19:40:14.989175 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:14.989086 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" event={"ID":"3c77de0a-afe6-477e-9ddc-3a115c09a920","Type":"ContainerStarted","Data":"c145667710ca92078010524f1f0dc8385119c77ccf562981e2e9634c6c179b35"} Apr 20 19:40:14.989348 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:14.989196 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:15.010267 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:15.010218 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" podStartSLOduration=2.010203637 podStartE2EDuration="2.010203637s" podCreationTimestamp="2026-04-20 19:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:40:15.006291379 +0000 UTC m=+888.996778914" watchObservedRunningTime="2026-04-20 19:40:15.010203637 +0000 UTC m=+889.000691171" Apr 20 19:40:15.178983 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:15.178905 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-cf4z7_c42ddb6b-2227-4768-8867-b1506419b88d/volume-data-source-validator/0.log" Apr 20 19:40:15.996110 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:15.996085 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-92xgv_b9a6ffc3-ed3d-4922-acb0-cf3513a1d431/dns/0.log" Apr 20 19:40:16.020142 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:16.020108 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-92xgv_b9a6ffc3-ed3d-4922-acb0-cf3513a1d431/kube-rbac-proxy/0.log" Apr 20 19:40:16.171662 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:16.171634 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b4sls_b0a25aae-e259-4ad8-b476-5694a4f39d1d/dns-node-resolver/0.log" Apr 20 19:40:16.709460 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:16.709415 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-b69bfb95-snnfp_6d5f453f-1b70-4078-8a96-844251489d5c/registry/0.log" Apr 20 19:40:16.792710 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:16.792684 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vvbgf_83a5e765-c988-4980-8534-55e55f1296d7/node-ca/0.log" Apr 20 19:40:17.656621 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:17.656588 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f9tbr7_5ad2a447-0e3b-4eb4-b9e6-3ac592de3ea8/istio-proxy/0.log" Apr 20 19:40:17.815131 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:17.815100 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-jkf4g_5888b8ba-846a-49d6-a0b9-b409361b907e/istio-proxy/0.log" Apr 20 19:40:18.426119 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:18.426091 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kxblw_bc24b476-7aaf-4c95-b13e-44550d15e793/serve-healthcheck-canary/0.log" Apr 20 19:40:19.163833 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:19.163801 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gjghs_c49e842f-07fc-49d1-a61f-45722a72a1cf/kube-rbac-proxy/0.log" Apr 20 19:40:19.190784 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:19.190755 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gjghs_c49e842f-07fc-49d1-a61f-45722a72a1cf/exporter/0.log" Apr 20 19:40:19.222666 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:19.222633 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gjghs_c49e842f-07fc-49d1-a61f-45722a72a1cf/extractor/0.log" Apr 20 19:40:21.003878 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:21.003848 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-8bwcj" Apr 20 19:40:21.223273 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:21.223247 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6996cc8f49-8fj8h_350c00d6-81b0-47a4-9396-0547f4b26823/maas-api/0.log" Apr 20 19:40:21.255498 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:21.255396 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-56cb74577d-pp6s4_4122ac80-0aee-4c28-8268-ef519cfb0da8/manager/0.log" Apr 20 19:40:21.324349 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:21.324318 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9f747d685-2n2zg_3507a6b2-e19b-491a-bd27-a1bb5136cda2/manager/0.log" Apr 20 19:40:22.916219 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:22.916152 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6ddf46b867-5jsw7_2020f544-06f0-42d2-94b5-697f9b55cd3a/manager/0.log" Apr 20 19:40:26.542385 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:26.542350 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/1.log" Apr 20 19:40:26.550117 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:26.550092 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qpl5s_a25d18d6-5add-4c28-a671-0ee5222cb999/console-operator/1.log" Apr 20 19:40:27.884981 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:27.884946 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-b74jx_e7a85080-ad4d-4e33-b890-2483a1f5c762/kube-storage-version-migrator-operator/1.log" Apr 20 19:40:27.885930 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:27.885904 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-b74jx_e7a85080-ad4d-4e33-b890-2483a1f5c762/kube-storage-version-migrator-operator/0.log" Apr 20 19:40:28.931301 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:28.931275 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z5nt_4f1d0d9b-42cd-49b6-9d9f-41487c76d136/kube-multus-additional-cni-plugins/0.log" Apr 20 19:40:28.960758 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:28.960730 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z5nt_4f1d0d9b-42cd-49b6-9d9f-41487c76d136/egress-router-binary-copy/0.log" Apr 20 19:40:28.984005 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:28.983981 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z5nt_4f1d0d9b-42cd-49b6-9d9f-41487c76d136/cni-plugins/0.log" Apr 20 19:40:29.007684 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:29.007657 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z5nt_4f1d0d9b-42cd-49b6-9d9f-41487c76d136/bond-cni-plugin/0.log" Apr 20 19:40:29.031038 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:29.031007 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z5nt_4f1d0d9b-42cd-49b6-9d9f-41487c76d136/routeoverride-cni/0.log" Apr 20 19:40:29.056834 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:29.056798 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z5nt_4f1d0d9b-42cd-49b6-9d9f-41487c76d136/whereabouts-cni-bincopy/0.log" Apr 20 19:40:29.081407 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:29.081383 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z5nt_4f1d0d9b-42cd-49b6-9d9f-41487c76d136/whereabouts-cni/0.log" Apr 20 19:40:29.609619 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:29.609534 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s7t97_066e3172-90cc-4dbf-9891-089727ab8561/kube-multus/0.log" Apr 20 19:40:29.634160 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:29.634135 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7cd7d_513dd790-7dbf-46da-821a-3493b9941466/network-metrics-daemon/0.log" Apr 20 19:40:29.656099 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:29.656074 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7cd7d_513dd790-7dbf-46da-821a-3493b9941466/kube-rbac-proxy/0.log" Apr 20 19:40:30.615868 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:30.615837 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ksvj_988fcf46-c192-47b7-a3ad-27d4676cf1f2/ovn-controller/0.log" Apr 20 19:40:30.640840 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:30.640812 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ksvj_988fcf46-c192-47b7-a3ad-27d4676cf1f2/ovn-acl-logging/0.log" Apr 20 19:40:30.663283 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:30.663261 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ksvj_988fcf46-c192-47b7-a3ad-27d4676cf1f2/kube-rbac-proxy-node/0.log" 
Apr 20 19:40:30.687162 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:30.687139 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ksvj_988fcf46-c192-47b7-a3ad-27d4676cf1f2/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 19:40:30.711059 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:30.711021 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ksvj_988fcf46-c192-47b7-a3ad-27d4676cf1f2/northd/0.log" Apr 20 19:40:30.735032 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:30.735007 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ksvj_988fcf46-c192-47b7-a3ad-27d4676cf1f2/nbdb/0.log" Apr 20 19:40:30.760774 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:30.760749 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ksvj_988fcf46-c192-47b7-a3ad-27d4676cf1f2/sbdb/0.log" Apr 20 19:40:30.869755 ip-10-0-131-162 kubenswrapper[2572]: I0420 19:40:30.869678 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ksvj_988fcf46-c192-47b7-a3ad-27d4676cf1f2/ovnkube-controller/0.log"