Apr 17 18:49:04.215192 ip-10-0-136-27 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 18:49:04.725027 ip-10-0-136-27 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:49:04.725027 ip-10-0-136-27 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 18:49:04.725027 ip-10-0-136-27 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:49:04.725027 ip-10-0-136-27 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 18:49:04.725027 ip-10-0-136-27 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:49:04.727292 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.727201 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 18:49:04.729627 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729611 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729628 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729634 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
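The five deprecation warnings above all point at the same fix: these settings belong in the file passed via --config (the flag dump later in this log shows --config="/etc/kubernetes/kubelet.conf"). As a minimal, illustrative sketch only, reusing the values that appear in the FLAG dump below, the equivalent KubeletConfiguration stanzas would look roughly like this; on an OpenShift node this file is normally rendered by the machine-config stack rather than edited by hand:

    # Sketch of KubeletConfiguration equivalents for the deprecated flags warned
    # about above; values copied from the FLAG dump later in this log.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: /var/run/crio/crio.sock            # replaces --container-runtime-endpoint
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec # replaces --volume-plugin-dir
    systemReserved:                                              # replaces --system-reserved
      cpu: 500m
      memory: 1Gi
      ephemeral-storage: 1Gi

(--minimum-container-ttl-duration has no direct config-file equivalent; per its warning it is superseded by the evictionHard/evictionSoft settings, and --pod-infra-container-image is being replaced by sandbox-image information obtained from the CRI side.)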
Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729638 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729642 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729645 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729648 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729650 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729654 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729656 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729659 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729662 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729664 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729667 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 18:49:04.729663 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729670 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729673 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729676 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729678 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729681 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729684 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729694 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729697 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729700 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729704 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729707 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729710 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729713 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729715 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729717 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729720 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729723 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729725 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729727 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729730 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 18:49:04.730027 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729732 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729735 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729737 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729740 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729743 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729745 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729748 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729750 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729753 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729764 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729766 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729769 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729771 2580 
feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729774 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729777 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729780 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729783 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729786 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729788 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729791 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 18:49:04.730503 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729794 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729796 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729799 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729801 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729804 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729806 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729809 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729812 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729814 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729817 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729819 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729822 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729825 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729827 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729830 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 18:49:04.731041 
ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729832 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729835 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729837 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729840 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729842 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 18:49:04.731041 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729845 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729848 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729850 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729854 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729856 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729859 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729863 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729866 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729869 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729872 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729874 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.729877 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731173 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731180 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731183 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731187 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731190 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731194 2580 feature_gate.go:328] 
unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731197 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 18:49:04.731526 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731200 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731202 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731205 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731208 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731211 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731214 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731217 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731220 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731222 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731225 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731227 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731230 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731233 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731235 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731238 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731241 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731243 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731246 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731248 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731252 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 18:49:04.731997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731256 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731259 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731261 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731264 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731267 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731270 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731272 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731274 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731277 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731280 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731282 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731285 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731287 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731289 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731292 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731294 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731296 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731299 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731302 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 18:49:04.732501 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731304 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731306 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731309 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 
18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731311 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731313 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731316 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731318 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731321 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731324 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731327 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731329 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731332 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731335 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731340 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731343 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731347 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731349 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731352 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731354 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 18:49:04.732969 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731357 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731359 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731361 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731364 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731367 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731369 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731372 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731374 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731377 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731380 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731383 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731385 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731388 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731390 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731392 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731395 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731398 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731400 2580 feature_gate.go:328] unrecognized 
feature gate: BuildCSIVolumes Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731402 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731405 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.731407 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 18:49:04.733469 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731487 2580 flags.go:64] FLAG: --address="0.0.0.0" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731495 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731501 2580 flags.go:64] FLAG: --anonymous-auth="true" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731506 2580 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731512 2580 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731516 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731521 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731526 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731530 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731533 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731537 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731540 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731543 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731546 2580 flags.go:64] FLAG: --cgroup-root="" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731549 2580 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731552 2580 flags.go:64] FLAG: --client-ca-file="" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731555 2580 flags.go:64] FLAG: --cloud-config="" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731557 2580 flags.go:64] FLAG: --cloud-provider="external" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731560 2580 flags.go:64] FLAG: --cluster-dns="[]" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731566 2580 flags.go:64] FLAG: --cluster-domain="" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731569 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731572 2580 flags.go:64] FLAG: --config-dir="" Apr 17 18:49:04.734007 ip-10-0-136-27 
kubenswrapper[2580]: I0417 18:49:04.731575 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731578 2580 flags.go:64] FLAG: --container-log-max-files="5" Apr 17 18:49:04.734007 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731583 2580 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731585 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731589 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731592 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731595 2580 flags.go:64] FLAG: --contention-profiling="false" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731597 2580 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731600 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731603 2580 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731606 2580 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731611 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731614 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731617 2580 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731621 2580 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731624 2580 flags.go:64] FLAG: --enable-server="true" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731627 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731631 2580 flags.go:64] FLAG: --event-burst="100" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731634 2580 flags.go:64] FLAG: --event-qps="50" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731637 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731640 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731643 2580 flags.go:64] FLAG: --eviction-hard="" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731647 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731649 2580 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731652 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731655 2580 flags.go:64] FLAG: --eviction-soft="" 
Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731658 2580 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 18:49:04.734580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731662 2580 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731664 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731667 2580 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731670 2580 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731673 2580 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731676 2580 flags.go:64] FLAG: --feature-gates="" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731679 2580 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731682 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731685 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731688 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731691 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731694 2580 flags.go:64] FLAG: --help="false" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731697 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-136-27.ec2.internal" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731700 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731703 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731705 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731709 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731712 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731715 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731719 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731722 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731725 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731728 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731731 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 
17 18:49:04.735204 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731734 2580 flags.go:64] FLAG: --kube-reserved="" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731737 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731739 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731742 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731745 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731748 2580 flags.go:64] FLAG: --lock-file="" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731751 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731753 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731756 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731762 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731764 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731767 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731770 2580 flags.go:64] FLAG: --logging-format="text" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731773 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731777 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731779 2580 flags.go:64] FLAG: --manifest-url="" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731782 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731786 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731789 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731793 2580 flags.go:64] FLAG: --max-pods="110" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731796 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731799 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731802 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731805 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731807 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 18:49:04.735778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731810 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 18:49:04.736414 
ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731814 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731822 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731825 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731828 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731831 2580 flags.go:64] FLAG: --pod-cidr="" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731834 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731839 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731842 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731845 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731848 2580 flags.go:64] FLAG: --port="10250" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731851 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731854 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07484d68c707e26f0" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731857 2580 flags.go:64] FLAG: --qos-reserved="" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731860 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731863 2580 flags.go:64] FLAG: --register-node="true" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731866 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731868 2580 flags.go:64] FLAG: --register-with-taints="" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731872 2580 flags.go:64] FLAG: --registry-burst="10" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731875 2580 flags.go:64] FLAG: --registry-qps="5" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731877 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731880 2580 flags.go:64] FLAG: --reserved-memory="" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731884 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731887 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731890 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 18:49:04.736414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731909 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731912 2580 flags.go:64] FLAG: --runonce="false" 
Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731915 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731918 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731921 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731924 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731927 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731930 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731933 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731940 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731943 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731946 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731949 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731952 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731955 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731958 2580 flags.go:64] FLAG: --system-cgroups="" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731960 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731967 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731969 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731972 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731976 2580 flags.go:64] FLAG: --tls-min-version="" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731979 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731982 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731985 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731987 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 18:49:04.737019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731990 2580 flags.go:64] FLAG: --v="2" Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.731999 2580 flags.go:64] FLAG: --version="false" Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.732003 2580 flags.go:64] FLAG: --vmodule="" Apr 17 18:49:04.737607 
ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.732009 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.732012 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732100 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732104 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732107 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732109 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732112 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732114 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732117 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732119 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732122 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732125 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732127 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732130 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732133 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732136 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732138 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732141 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 18:49:04.737607 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732143 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732146 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732148 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732151 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732153 2580 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732156 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732158 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732160 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732163 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732165 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732168 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732178 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732183 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732185 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732188 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732190 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732193 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732195 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732198 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 18:49:04.738131 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732202 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732205 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732208 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732211 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732214 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732217 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732219 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732221 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732224 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732226 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732230 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732235 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732238 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732241 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732243 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732246 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732249 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732251 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732253 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732256 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 18:49:04.738816 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732258 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732261 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732263 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732265 2580 
feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732268 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732271 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732274 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732276 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732279 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732281 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732284 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732286 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732288 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732291 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732294 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732297 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732299 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732302 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732304 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 18:49:04.739687 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732306 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732309 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732311 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732314 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732316 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732319 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732322 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement 
Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732324 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732327 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732329 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732331 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.732334 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.732948 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 18:49:04.740515 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.740501 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.740528 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740602 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740610 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740615 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740619 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740623 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740628 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740633 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740637 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740641 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740645 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740650 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740654 2580 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740659 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740663 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740669 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740676 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740680 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 18:49:04.741111 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740684 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740688 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740692 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740696 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740701 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740704 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740709 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740713 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740717 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740721 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740725 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740729 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740734 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740738 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740744 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740748 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740752 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 18:49:04.741921 
ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740756 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740760 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740765 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 18:49:04.741921 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740769 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740773 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740777 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740781 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740785 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740789 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740793 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740797 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740801 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740806 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740810 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740814 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740819 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740822 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740826 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740830 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740834 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740839 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740843 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740847 2580 feature_gate.go:328] unrecognized feature 
gate: InsightsConfig Apr 17 18:49:04.742419 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740851 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740855 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740858 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740862 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740866 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740871 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740876 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740882 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740886 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740909 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740914 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740918 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740922 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740929 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
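Each pass over the gate list ends with a feature_gate.go:384 entry of the form "feature gates: {map[Name:bool ...]}"; that map, not the warnings, is what the kubelet actually applies. A minimal sketch for turning one of those dump lines into a Python dict; the hard-coded string below is only an excerpt of the dump logged above:

    # Parse a "feature gates: {map[...]}" dump into a dict of gate -> enabled.
    import re

    dump = ("feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false "
            "ImageVolume:true KMSv1:true NodeSwap:false ProcMountType:true "
            "ServiceAccountTokenNodeBinding:true UserNamespacesSupport:true]}")

    gates = {name: val == "true"
             for name, val in re.findall(r"([A-Za-z0-9]+):(true|false)", dump)}

    print("enabled: ", sorted(g for g, on in gates.items() if on))
    print("disabled:", sorted(g for g, on in gates.items() if not on))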
Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740936 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740942 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740946 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740952 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740956 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 18:49:04.743166 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740960 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740964 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740969 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740973 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740977 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740982 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740985 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740990 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740994 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.740998 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.741007 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741167 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741175 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741180 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741185 2580 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741189 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 18:49:04.743940 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741194 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741198 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741202 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741206 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741210 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741215 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741219 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741223 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741227 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741231 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741235 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741240 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741244 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741248 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741252 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741257 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741261 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741265 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741269 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741273 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 18:49:04.744709 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741277 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 18:49:04.745423 
ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741281 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741285 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741289 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741294 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741298 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741302 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741306 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741311 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741314 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741319 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741322 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741326 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741331 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741335 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741339 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741343 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741347 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741352 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 18:49:04.745423 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741359 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
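The same warning batch repeats several times, which at a glance can look like the kubelet is crash-looping; the timestamps and the constant [2580] PID show it is one process re-applying the same gate list within a fraction of a second. A small check along these lines (same assumed kubelet.log file as above) makes that explicit:

    # Confirm the repeated gate warnings come from one kubelet process
    # in a short window, not from repeated restarts.
    import re

    with open("kubelet.log", encoding="utf-8") as fh:
        text = fh.read()

    pids = set(re.findall(r"kubenswrapper\[(\d+)\]", text))
    dumps = re.findall(
        r"[IWE](\d{4} \d{2}:\d{2}:\d{2}\.\d+)\s+\d+\s+feature_gate\.go:384", text)

    print("kubelet PIDs seen:", pids)
    print("gate dumps logged at:", dumps)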
Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741364 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741369 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741374 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741379 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741383 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741387 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741391 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741396 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741400 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741404 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741408 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741413 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741418 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741422 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741425 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741429 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741434 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741438 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741443 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 18:49:04.745917 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741447 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741451 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741455 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741459 2580 
feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741462 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741467 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741470 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741475 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741479 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741484 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741488 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741492 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741497 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741501 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741505 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741511 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
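Alongside the unrecognized names, two other warning kinds recur: feature_gate.go:349 ("Setting deprecated feature gate KMSv1=true") and feature_gate.go:351 ("Setting GA feature gate ServiceAccountTokenNodeBinding=true"). Both gates are still applied (they show up as true in the feature-gates dump), but the warnings flag explicit settings that will stop being needed; whether to trim them from the gate list is a cluster-configuration decision, not something the kubelet enforces. A minimal sketch that pulls out exactly which gates triggered these warnings (same assumed kubelet.log file):

    # List gates that are explicitly set although already deprecated or GA.
    import re

    with open("kubelet.log", encoding="utf-8") as fh:
        text = fh.read()

    deprecated = set(re.findall(r"Setting deprecated feature gate (\w+)=", text))
    ga = set(re.findall(r"Setting GA feature gate (\w+)=", text))

    print("deprecated but still set:", sorted(deprecated))
    print("GA but still set:       ", sorted(ga))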
Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741517 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741523 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741527 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741532 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 18:49:04.746531 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741536 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 18:49:04.747048 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:04.741540 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 18:49:04.747048 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.741549 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 18:49:04.747048 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.742703 2580 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 18:49:04.747048 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.746063 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 18:49:04.747173 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.747154 2580 server.go:1019] "Starting client certificate rotation" Apr 17 18:49:04.747273 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.747254 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 18:49:04.747308 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.747296 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 18:49:04.777086 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.777057 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 18:49:04.780453 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.780431 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 18:49:04.813494 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.813465 2580 log.go:25] "Validated CRI v1 runtime API" Apr 17 18:49:04.819075 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.819057 2580 log.go:25] "Validated CRI v1 image API" Apr 17 18:49:04.820308 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.820289 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 18:49:04.823073 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.823052 2580 fs.go:135] Filesystem UUIDs: map[1b9e4383-68db-48f9-abcc-5b488bb69033:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 800c018d-b039-4b2d-b5a6-537806817264:/dev/nvme0n1p3] Apr 17 18:49:04.823160 ip-10-0-136-27 
kubenswrapper[2580]: I0417 18:49:04.823071 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 18:49:04.827498 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.827478 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 18:49:04.828566 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.828457 2580 manager.go:217] Machine: {Timestamp:2026-04-17 18:49:04.826856932 +0000 UTC m=+0.479392849 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099092 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2193298cb4a6fb7017ab5cc8cda156 SystemUUID:ec219329-8cb4-a6fb-7017-ab5cc8cda156 BootID:feff422c-72f4-4d2f-bd9e-7a4292be287e Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9c:ba:bb:b9:7f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9c:ba:bb:b9:7f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8a:c7:de:c3:5b:54 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 18:49:04.828566 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.828563 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. 
Perf event counters are not available. Apr 17 18:49:04.828712 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.828701 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 18:49:04.831304 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.831274 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 18:49:04.831453 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.831307 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-27.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 18:49:04.831504 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.831463 2580 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 18:49:04.831504 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.831471 2580 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 18:49:04.831504 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.831483 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 18:49:04.831504 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.831500 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 18:49:04.833096 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.833085 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 17 18:49:04.833376 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.833366 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 18:49:04.836000 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.835990 2580 kubelet.go:491] "Attempting to sync node with API server" Apr 17 18:49:04.836053 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.836004 2580 
kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 18:49:04.836053 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.836017 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 18:49:04.836053 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.836026 2580 kubelet.go:397] "Adding apiserver pod source" Apr 17 18:49:04.836053 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.836034 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 18:49:04.837516 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.837501 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 18:49:04.837552 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.837522 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 18:49:04.840822 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.840807 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 18:49:04.842175 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.842155 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 18:49:04.844159 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.844144 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 18:49:04.844240 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.844176 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 18:49:04.844240 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.844217 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 18:49:04.844240 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.844227 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 18:49:04.844240 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.844236 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 18:49:04.844417 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.844246 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 18:49:04.844417 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.844257 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 18:49:04.844417 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.844266 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 18:49:04.844417 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.844276 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 18:49:04.844417 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.844284 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 18:49:04.844417 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.844306 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 18:49:04.844417 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.844320 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 18:49:04.845129 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.845117 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 18:49:04.845177 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.845133 2580 plugins.go:616] "Loaded 
volume plugin" pluginName="kubernetes.io/image" Apr 17 18:49:04.848636 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.848620 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 18:49:04.848724 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.848662 2580 server.go:1295] "Started kubelet" Apr 17 18:49:04.848812 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.848784 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 18:49:04.848940 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.848861 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 18:49:04.848984 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.848965 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 18:49:04.849561 ip-10-0-136-27 systemd[1]: Started Kubernetes Kubelet. Apr 17 18:49:04.849934 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.849917 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 18:49:04.850184 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.850050 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-27.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 18:49:04.850184 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:04.850073 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-27.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 18:49:04.850184 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:04.850152 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 18:49:04.856186 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.856158 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 17 18:49:04.858949 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.858926 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 18:49:04.859054 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:04.858947 2580 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 18:49:04.859437 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.859425 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 18:49:04.860304 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860285 2580 factory.go:153] Registering CRI-O factory Apr 17 18:49:04.860304 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860305 2580 factory.go:223] Registration of the crio container factory successfully Apr 17 18:49:04.860425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860326 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 18:49:04.860425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860341 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 18:49:04.860425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860328 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 18:49:04.860425 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:04.860337 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:04.860425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860395 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 17 18:49:04.860425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860404 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 17 18:49:04.860695 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860479 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 18:49:04.860695 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860489 2580 factory.go:55] Registering systemd factory Apr 17 18:49:04.860695 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860494 2580 factory.go:223] Registration of the systemd container factory successfully Apr 17 18:49:04.860695 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860507 2580 factory.go:103] Registering Raw factory Apr 17 18:49:04.860695 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860516 2580 manager.go:1196] Started watching for new ooms in manager Apr 17 18:49:04.860979 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.860967 2580 manager.go:319] Starting recovery of all containers Apr 17 18:49:04.861120 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:04.859975 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-27.ec2.internal.18a739732c535dba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-27.ec2.internal,UID:ip-10-0-136-27.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-27.ec2.internal,},FirstTimestamp:2026-04-17 18:49:04.848633274 +0000 UTC m=+0.501169194,LastTimestamp:2026-04-17 18:49:04.848633274 +0000 UTC m=+0.501169194,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-27.ec2.internal,}" Apr 17 18:49:04.867694 ip-10-0-136-27 
kubenswrapper[2580]: E0417 18:49:04.867670 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 18:49:04.868426 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:04.868292 2580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-27.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 18:49:04.869677 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.869644 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jz4sm" Apr 17 18:49:04.870658 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.870641 2580 manager.go:324] Recovery completed Apr 17 18:49:04.875116 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.875104 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:04.877778 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.877758 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:04.877855 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.877792 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:04.877855 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.877807 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:04.878350 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.878336 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 18:49:04.878350 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.878349 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 18:49:04.878442 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.878365 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 17 18:49:04.879832 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:04.879767 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-27.ec2.internal.18a739732e100c62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-27.ec2.internal,UID:ip-10-0-136-27.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-27.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-27.ec2.internal,},FirstTimestamp:2026-04-17 18:49:04.87777597 +0000 UTC m=+0.530311886,LastTimestamp:2026-04-17 18:49:04.87777597 +0000 UTC m=+0.530311886,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-27.ec2.internal,}" Apr 17 18:49:04.880215 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.880197 2580 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jz4sm" Apr 17 18:49:04.881035 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.881020 2580 policy_none.go:49] "None policy: Start" Apr 17 18:49:04.881076 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.881042 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 18:49:04.881076 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.881056 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 17 18:49:04.924494 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.924478 2580 manager.go:341] "Starting Device Plugin manager" Apr 17 18:49:04.934275 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:04.924526 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 18:49:04.934275 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.924547 2580 server.go:85] "Starting device plugin registration server" Apr 17 18:49:04.934275 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.925159 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 18:49:04.934275 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.925180 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 18:49:04.934275 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.925388 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 18:49:04.934275 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.925463 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 18:49:04.934275 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.925472 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 18:49:04.934275 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:04.926144 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 18:49:04.934275 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:04.926197 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:04.971357 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.971326 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 18:49:04.972766 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.972741 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 18:49:04.972865 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.972776 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 18:49:04.972865 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.972799 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
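The error-level entries earlier in this startup (reflector "Failed to watch" for nodes, services and CSI drivers, the rejected events, the lease controller failure, and the eviction manager's "node not found") all cite User "system:anonymous" or a missing node object, which is consistent with the TLS bootstrap still being in progress: the kubelet is requesting its client certificate via csr-jz4sm, which is logged above as approved and then issued, and the node itself is not registered until "Successfully registered node" below. Such errors are expected to stop once the issued certificate is picked up; ones that persist afterwards deserve attention. A minimal triage sketch (same assumed kubelet.log file) that groups error-level klog entries by source location:

    # Group error-level (E...) klog entries by source file to spot which
    # errors persist past the TLS bootstrap phase.
    import re
    from collections import Counter

    with open("kubelet.log", encoding="utf-8") as fh:
        text = fh.read()

    errors = re.findall(
        r"E\d{4} \d{2}:\d{2}:\d{2}\.\d+\s+\d+\s+([\w.]+\.go:\d+)", text)
    for src, n in Counter(errors).most_common():
        print(f"{src}: {n} error line(s)")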
Apr 17 18:49:04.972865 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.972809 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 18:49:04.972865 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:04.972847 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 18:49:04.976455 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:04.976399 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:05.026387 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.026356 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:05.027365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.027351 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:05.027408 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.027381 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:05.027408 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.027392 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:05.027473 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.027415 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.033715 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.033701 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.033765 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.033723 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-27.ec2.internal\": node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:05.044025 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.043997 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:05.073254 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.073224 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal"] Apr 17 18:49:05.073331 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.073301 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:05.074185 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.074169 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:05.074265 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.074197 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:05.074265 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.074207 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:05.075320 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.075307 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:05.075459 ip-10-0-136-27 kubenswrapper[2580]: I0417 
18:49:05.075446 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.075491 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.075474 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:05.076030 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.076008 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:05.076030 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.076036 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:05.076197 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.076041 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:05.076197 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.076067 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:05.076197 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.076080 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:05.076197 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.076045 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:05.077206 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.077191 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.077280 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.077218 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:05.077844 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.077828 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:05.077939 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.077860 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:05.077939 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.077875 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:05.112592 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.112566 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-27.ec2.internal\" not found" node="ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.117020 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.117003 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-27.ec2.internal\" not found" node="ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.144230 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.144209 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:05.162943 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.162924 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a7b645763bd4d0264284f3e94b95b589-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal\" (UID: \"a7b645763bd4d0264284f3e94b95b589\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.163006 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.162950 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7b645763bd4d0264284f3e94b95b589-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal\" (UID: \"a7b645763bd4d0264284f3e94b95b589\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.163006 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.162970 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6402b5e4dc46963653aa05278c9bac43-config\") pod \"kube-apiserver-proxy-ip-10-0-136-27.ec2.internal\" (UID: \"6402b5e4dc46963653aa05278c9bac43\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.245036 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.244969 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:05.263475 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.263448 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a7b645763bd4d0264284f3e94b95b589-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal\" (UID: \"a7b645763bd4d0264284f3e94b95b589\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.263543 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.263488 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7b645763bd4d0264284f3e94b95b589-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal\" (UID: \"a7b645763bd4d0264284f3e94b95b589\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.263543 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.263509 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6402b5e4dc46963653aa05278c9bac43-config\") pod \"kube-apiserver-proxy-ip-10-0-136-27.ec2.internal\" (UID: \"6402b5e4dc46963653aa05278c9bac43\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.263543 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.263537 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a7b645763bd4d0264284f3e94b95b589-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal\" (UID: \"a7b645763bd4d0264284f3e94b95b589\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.263660 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.263564 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7b645763bd4d0264284f3e94b95b589-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal\" (UID: \"a7b645763bd4d0264284f3e94b95b589\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.263660 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.263542 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6402b5e4dc46963653aa05278c9bac43-config\") pod \"kube-apiserver-proxy-ip-10-0-136-27.ec2.internal\" (UID: \"6402b5e4dc46963653aa05278c9bac43\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.345809 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.345782 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:05.416407 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.416372 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.419493 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.419437 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal" Apr 17 18:49:05.446921 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.446879 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:05.547513 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.547424 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:05.647938 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.647885 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:05.708299 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.708259 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:05.747277 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.747246 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 18:49:05.747917 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.747418 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 18:49:05.747917 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.747429 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 18:49:05.748338 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.748321 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:05.849273 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.849242 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:05.859157 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.859137 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 18:49:05.868330 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.868303 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 18:49:05.881990 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.881949 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 18:44:04 +0000 UTC" deadline="2027-09-13 19:40:06.823700496 +0000 UTC" Apr 17 18:49:05.881990 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.881988 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12336h51m0.9417157s" Apr 17 18:49:05.887417 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.887400 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-trvk4" Apr 17 18:49:05.894598 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.894578 2580 
csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-trvk4" Apr 17 18:49:05.924561 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:05.924529 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7b645763bd4d0264284f3e94b95b589.slice/crio-789354e98d7b5fcc2a5a8edebffef32f48d12feecd2932619071d272cf50704a WatchSource:0}: Error finding container 789354e98d7b5fcc2a5a8edebffef32f48d12feecd2932619071d272cf50704a: Status 404 returned error can't find the container with id 789354e98d7b5fcc2a5a8edebffef32f48d12feecd2932619071d272cf50704a Apr 17 18:49:05.925046 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:05.925030 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6402b5e4dc46963653aa05278c9bac43.slice/crio-74df9bc87ae03d3c17b0f786619ec70371e1145ed9c783530e8f3f730b6c43d6 WatchSource:0}: Error finding container 74df9bc87ae03d3c17b0f786619ec70371e1145ed9c783530e8f3f730b6c43d6: Status 404 returned error can't find the container with id 74df9bc87ae03d3c17b0f786619ec70371e1145ed9c783530e8f3f730b6c43d6 Apr 17 18:49:05.932081 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.932068 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:49:05.938668 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.938644 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:05.949344 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:05.949322 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:05.975619 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.975573 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal" event={"ID":"6402b5e4dc46963653aa05278c9bac43","Type":"ContainerStarted","Data":"74df9bc87ae03d3c17b0f786619ec70371e1145ed9c783530e8f3f730b6c43d6"} Apr 17 18:49:05.976515 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:05.976495 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" event={"ID":"a7b645763bd4d0264284f3e94b95b589","Type":"ContainerStarted","Data":"789354e98d7b5fcc2a5a8edebffef32f48d12feecd2932619071d272cf50704a"} Apr 17 18:49:06.049700 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:06.049670 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found" Apr 17 18:49:06.128570 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.128493 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:06.160249 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.160226 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" Apr 17 18:49:06.172906 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.172873 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 18:49:06.173817 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.173804 2580 kubelet.go:3340] "Creating a mirror pod for 
static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal" Apr 17 18:49:06.179246 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.179230 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 18:49:06.837770 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.837738 2580 apiserver.go:52] "Watching apiserver" Apr 17 18:49:06.845647 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.845619 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 18:49:06.848285 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.848254 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jhmxj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8","openshift-dns/node-resolver-zsszv","openshift-image-registry/node-ca-vnmqw","openshift-multus/multus-additional-cni-plugins-4hnl2","openshift-network-operator/iptables-alerter-5qnfh","kube-system/konnectivity-agent-4vj6d","kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal","openshift-cluster-node-tuning-operator/tuned-t56ft","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal","openshift-multus/multus-vcf4s","openshift-multus/network-metrics-daemon-v24kx","openshift-network-diagnostics/network-check-target-qbtjg","openshift-ovn-kubernetes/ovnkube-node-7klhd"] Apr 17 18:49:06.853051 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.852652 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:06.853051 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:06.852728 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:06.853884 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.853857 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4vj6d" Apr 17 18:49:06.855145 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.855108 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zsszv" Apr 17 18:49:06.856453 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.856226 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 18:49:06.856453 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.856231 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rqpqv\"" Apr 17 18:49:06.856453 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.856318 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 18:49:06.856643 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.856471 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vnmqw" Apr 17 18:49:06.857223 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.857205 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 18:49:06.857223 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.857215 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 18:49:06.857377 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.857239 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xlxz2\"" Apr 17 18:49:06.857799 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.857772 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5qnfh" Apr 17 18:49:06.858781 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.858752 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 18:49:06.859019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.858853 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2vzsf\"" Apr 17 18:49:06.859225 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.859189 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 18:49:06.859314 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.859274 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 18:49:06.859557 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.859535 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.859817 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.859800 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 18:49:06.861006 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.860048 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 18:49:06.861006 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.860201 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:49:06.861006 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.860302 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gczng\"" Apr 17 18:49:06.861182 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.861079 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.861769 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.861517 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 18:49:06.861907 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.861870 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 18:49:06.862097 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.862080 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dn6ct\"" Apr 17 18:49:06.862164 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.862117 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 18:49:06.862164 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.862138 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 18:49:06.862242 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.862123 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 18:49:06.863021 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.862988 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:49:06.863149 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.863111 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 18:49:06.863214 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.863159 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-b6ngv\"" Apr 17 18:49:06.864009 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.863986 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:06.864144 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:06.864114 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:06.865402 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.865384 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:06.867122 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.867096 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.867831 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.867795 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 18:49:06.867944 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.867851 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-74dfs\"" Apr 17 18:49:06.868227 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.868199 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 18:49:06.868499 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.868482 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 18:49:06.868796 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.868779 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:06.868884 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.868831 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.869018 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:06.868856 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:06.869178 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.869148 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 18:49:06.869420 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.869400 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 18:49:06.869481 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.869468 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sfft2\"" Apr 17 18:49:06.870192 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.870173 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 18:49:06.870406 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.870377 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 18:49:06.870479 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.870466 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 18:49:06.870909 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.870873 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 18:49:06.871173 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.871155 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8mcwb\"" Apr 17 18:49:06.871380 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.871363 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 18:49:06.872809 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.872787 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/df4bf5b8-bdc2-4ccc-b126-8107588c1304-kubelet-config\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:06.872948 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.872823 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-kubernetes\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.872948 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.872847 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-host\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.872948 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.872869 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-tuned\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.872948 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.872884 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/91f3ddfa-7360-41d5-bb54-9c1c21904eb4-agent-certs\") pod \"konnectivity-agent-4vj6d\" (UID: \"91f3ddfa-7360-41d5-bb54-9c1c21904eb4\") " pod="kube-system/konnectivity-agent-4vj6d" Apr 17 18:49:06.873162 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.872961 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jzdk\" (UniqueName: \"kubernetes.io/projected/39fdb78c-608d-4cb2-8a53-feb04ee1cdcf-kube-api-access-2jzdk\") pod \"node-ca-vnmqw\" (UID: \"39fdb78c-608d-4cb2-8a53-feb04ee1cdcf\") " pod="openshift-image-registry/node-ca-vnmqw" Apr 17 18:49:06.873162 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873014 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff6339d8-541a-4947-b00d-5f600d8d08c1-os-release\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.873162 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873049 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/87361592-a029-4a93-9af8-ed4f1a1cc87c-hosts-file\") pod \"node-resolver-zsszv\" (UID: \"87361592-a029-4a93-9af8-ed4f1a1cc87c\") " pod="openshift-dns/node-resolver-zsszv" Apr 17 18:49:06.873162 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873074 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8crqt\" (UniqueName: \"kubernetes.io/projected/b33fd58c-e89c-424c-ad23-86a03f08d725-kube-api-access-8crqt\") pod \"iptables-alerter-5qnfh\" (UID: \"b33fd58c-e89c-424c-ad23-86a03f08d725\") " pod="openshift-network-operator/iptables-alerter-5qnfh" Apr 17 18:49:06.873162 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873098 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-modprobe-d\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.873425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873164 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-sysconfig\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.873425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873202 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-sysctl-d\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " 
pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.873425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873259 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/39fdb78c-608d-4cb2-8a53-feb04ee1cdcf-serviceca\") pod \"node-ca-vnmqw\" (UID: \"39fdb78c-608d-4cb2-8a53-feb04ee1cdcf\") " pod="openshift-image-registry/node-ca-vnmqw" Apr 17 18:49:06.873425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873286 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff6339d8-541a-4947-b00d-5f600d8d08c1-system-cni-dir\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.873425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873321 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ff6339d8-541a-4947-b00d-5f600d8d08c1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.873425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873347 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-lib-modules\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.873425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873370 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j7jx\" (UniqueName: \"kubernetes.io/projected/7cc323ad-cefa-4b44-8df6-0997a152f7ba-kube-api-access-2j7jx\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.873425 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873393 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39fdb78c-608d-4cb2-8a53-feb04ee1cdcf-host\") pod \"node-ca-vnmqw\" (UID: \"39fdb78c-608d-4cb2-8a53-feb04ee1cdcf\") " pod="openshift-image-registry/node-ca-vnmqw" Apr 17 18:49:06.873791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873431 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff6339d8-541a-4947-b00d-5f600d8d08c1-cnibin\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.873791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873465 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff6339d8-541a-4947-b00d-5f600d8d08c1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.873791 ip-10-0-136-27 
kubenswrapper[2580]: I0417 18:49:06.873490 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:06.873791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873511 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-systemd\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.873791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873544 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff6339d8-541a-4947-b00d-5f600d8d08c1-cni-binary-copy\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.873791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873567 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ff6339d8-541a-4947-b00d-5f600d8d08c1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.873791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873589 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b33fd58c-e89c-424c-ad23-86a03f08d725-host-slash\") pod \"iptables-alerter-5qnfh\" (UID: \"b33fd58c-e89c-424c-ad23-86a03f08d725\") " pod="openshift-network-operator/iptables-alerter-5qnfh" Apr 17 18:49:06.873791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873630 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/df4bf5b8-bdc2-4ccc-b126-8107588c1304-dbus\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:06.873791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873672 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lc74\" (UniqueName: \"kubernetes.io/projected/87361592-a029-4a93-9af8-ed4f1a1cc87c-kube-api-access-8lc74\") pod \"node-resolver-zsszv\" (UID: \"87361592-a029-4a93-9af8-ed4f1a1cc87c\") " pod="openshift-dns/node-resolver-zsszv" Apr 17 18:49:06.873791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873697 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-run\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.873791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873750 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-sys\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.873791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873773 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k2fp\" (UniqueName: \"kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp\") pod \"network-check-target-qbtjg\" (UID: \"699f0c3d-9fe9-46b1-8ca2-967fca78f239\") " pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:06.874418 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873802 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-sysctl-conf\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.874418 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873846 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-var-lib-kubelet\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.874418 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873911 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87361592-a029-4a93-9af8-ed4f1a1cc87c-tmp-dir\") pod \"node-resolver-zsszv\" (UID: \"87361592-a029-4a93-9af8-ed4f1a1cc87c\") " pod="openshift-dns/node-resolver-zsszv" Apr 17 18:49:06.874418 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873950 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b33fd58c-e89c-424c-ad23-86a03f08d725-iptables-alerter-script\") pod \"iptables-alerter-5qnfh\" (UID: \"b33fd58c-e89c-424c-ad23-86a03f08d725\") " pod="openshift-network-operator/iptables-alerter-5qnfh" Apr 17 18:49:06.874418 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873970 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7cc323ad-cefa-4b44-8df6-0997a152f7ba-tmp\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.874418 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.873990 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/91f3ddfa-7360-41d5-bb54-9c1c21904eb4-konnectivity-ca\") pod \"konnectivity-agent-4vj6d\" (UID: \"91f3ddfa-7360-41d5-bb54-9c1c21904eb4\") " pod="kube-system/konnectivity-agent-4vj6d" Apr 17 18:49:06.874418 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.874017 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbz4n\" (UniqueName: \"kubernetes.io/projected/ff6339d8-541a-4947-b00d-5f600d8d08c1-kube-api-access-pbz4n\") pod 
\"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.895365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.895341 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 18:44:05 +0000 UTC" deadline="2027-12-30 05:02:58.083898324 +0000 UTC" Apr 17 18:49:06.895365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.895361 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14914h13m51.188539004s" Apr 17 18:49:06.961611 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.961584 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 18:49:06.974528 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974493 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lc74\" (UniqueName: \"kubernetes.io/projected/87361592-a029-4a93-9af8-ed4f1a1cc87c-kube-api-access-8lc74\") pod \"node-resolver-zsszv\" (UID: \"87361592-a029-4a93-9af8-ed4f1a1cc87c\") " pod="openshift-dns/node-resolver-zsszv" Apr 17 18:49:06.974679 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974534 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-run\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.974679 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974564 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k2fp\" (UniqueName: \"kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp\") pod \"network-check-target-qbtjg\" (UID: \"699f0c3d-9fe9-46b1-8ca2-967fca78f239\") " pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:06.974679 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974639 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-run\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.974836 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974714 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xb5k\" (UniqueName: \"kubernetes.io/projected/c0f5f82e-6018-4e89-909a-07d817ebf145-kube-api-access-8xb5k\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:06.974836 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974745 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-run-netns\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.974836 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974763 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-device-dir\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:06.974836 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974787 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-run-openvswitch\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.974836 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974817 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-cni-bin\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974841 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-multus-socket-dir-parent\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974863 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87361592-a029-4a93-9af8-ed4f1a1cc87c-tmp-dir\") pod \"node-resolver-zsszv\" (UID: \"87361592-a029-4a93-9af8-ed4f1a1cc87c\") " pod="openshift-dns/node-resolver-zsszv" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974882 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/91f3ddfa-7360-41d5-bb54-9c1c21904eb4-konnectivity-ca\") pod \"konnectivity-agent-4vj6d\" (UID: \"91f3ddfa-7360-41d5-bb54-9c1c21904eb4\") " pod="kube-system/konnectivity-agent-4vj6d" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974920 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-var-lib-kubelet\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974935 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-hostroot\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974953 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbz4n\" (UniqueName: \"kubernetes.io/projected/ff6339d8-541a-4947-b00d-5f600d8d08c1-kube-api-access-pbz4n\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.975127 
ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.974979 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/df4bf5b8-bdc2-4ccc-b126-8107588c1304-kubelet-config\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975001 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-host\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975018 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-tuned\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975043 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/91f3ddfa-7360-41d5-bb54-9c1c21904eb4-agent-certs\") pod \"konnectivity-agent-4vj6d\" (UID: \"91f3ddfa-7360-41d5-bb54-9c1c21904eb4\") " pod="kube-system/konnectivity-agent-4vj6d" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jzdk\" (UniqueName: \"kubernetes.io/projected/39fdb78c-608d-4cb2-8a53-feb04ee1cdcf-kube-api-access-2jzdk\") pod \"node-ca-vnmqw\" (UID: \"39fdb78c-608d-4cb2-8a53-feb04ee1cdcf\") " pod="openshift-image-registry/node-ca-vnmqw" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975102 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-log-socket\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975125 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-cnibin\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.975127 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975125 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-host\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975141 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/df4bf5b8-bdc2-4ccc-b126-8107588c1304-kubelet-config\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 
18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975149 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff6339d8-541a-4947-b00d-5f600d8d08c1-os-release\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975200 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87361592-a029-4a93-9af8-ed4f1a1cc87c-tmp-dir\") pod \"node-resolver-zsszv\" (UID: \"87361592-a029-4a93-9af8-ed4f1a1cc87c\") " pod="openshift-dns/node-resolver-zsszv" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975232 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff6339d8-541a-4947-b00d-5f600d8d08c1-os-release\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975273 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/87361592-a029-4a93-9af8-ed4f1a1cc87c-hosts-file\") pod \"node-resolver-zsszv\" (UID: \"87361592-a029-4a93-9af8-ed4f1a1cc87c\") " pod="openshift-dns/node-resolver-zsszv" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975326 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8crqt\" (UniqueName: \"kubernetes.io/projected/b33fd58c-e89c-424c-ad23-86a03f08d725-kube-api-access-8crqt\") pod \"iptables-alerter-5qnfh\" (UID: \"b33fd58c-e89c-424c-ad23-86a03f08d725\") " pod="openshift-network-operator/iptables-alerter-5qnfh" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975352 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-sysconfig\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975406 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/87361592-a029-4a93-9af8-ed4f1a1cc87c-hosts-file\") pod \"node-resolver-zsszv\" (UID: \"87361592-a029-4a93-9af8-ed4f1a1cc87c\") " pod="openshift-dns/node-resolver-zsszv" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975460 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-sys-fs\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975476 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-sysconfig\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " 
pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975458 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975497 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/91f3ddfa-7360-41d5-bb54-9c1c21904eb4-konnectivity-ca\") pod \"konnectivity-agent-4vj6d\" (UID: \"91f3ddfa-7360-41d5-bb54-9c1c21904eb4\") " pod="kube-system/konnectivity-agent-4vj6d" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975543 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-kubelet\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975597 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-systemd-units\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975627 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-slash\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975647 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ff6339d8-541a-4947-b00d-5f600d8d08c1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.975705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975667 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-lib-modules\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975692 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j7jx\" (UniqueName: \"kubernetes.io/projected/7cc323ad-cefa-4b44-8df6-0997a152f7ba-kube-api-access-2j7jx\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975710 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-run-ovn\") pod 
\"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975726 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-run-ovn-kubernetes\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975750 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975776 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-multus-cni-dir\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975796 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-os-release\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975820 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975850 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87d38419-312e-4358-b18c-7e7b24e8189f-ovn-node-metrics-cert\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975874 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-run-k8s-cni-cncf-io\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975915 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-run-multus-certs\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.976574 
ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975939 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-etc-kubernetes\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975966 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff6339d8-541a-4947-b00d-5f600d8d08c1-cni-binary-copy\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975993 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b33fd58c-e89c-424c-ad23-86a03f08d725-host-slash\") pod \"iptables-alerter-5qnfh\" (UID: \"b33fd58c-e89c-424c-ad23-86a03f08d725\") " pod="openshift-network-operator/iptables-alerter-5qnfh" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.975872 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-lib-modules\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976030 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:06.976574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976103 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-socket-dir\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976133 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-multus-daemon-config\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976171 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/df4bf5b8-bdc2-4ccc-b126-8107588c1304-dbus\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976220 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/ff6339d8-541a-4947-b00d-5f600d8d08c1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976213 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-sys\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:06.976343 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976345 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-sys\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976388 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87d38419-312e-4358-b18c-7e7b24e8189f-env-overrides\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:06.976460 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret podName:df4bf5b8-bdc2-4ccc-b126-8107588c1304 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:07.476421955 +0000 UTC m=+3.128957886 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret") pod "global-pull-secret-syncer-jhmxj" (UID: "df4bf5b8-bdc2-4ccc-b126-8107588c1304") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976464 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/df4bf5b8-bdc2-4ccc-b126-8107588c1304-dbus\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976483 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-system-cni-dir\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976506 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b33fd58c-e89c-424c-ad23-86a03f08d725-host-slash\") pod \"iptables-alerter-5qnfh\" (UID: \"b33fd58c-e89c-424c-ad23-86a03f08d725\") " pod="openshift-network-operator/iptables-alerter-5qnfh" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976512 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whvkf\" (UniqueName: \"kubernetes.io/projected/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-kube-api-access-whvkf\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976523 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff6339d8-541a-4947-b00d-5f600d8d08c1-cni-binary-copy\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976548 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-sysctl-conf\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976575 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-var-lib-kubelet\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.977271 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976618 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-var-lib-kubelet\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.977271 
ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976647 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-var-lib-openvswitch\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976692 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87d38419-312e-4358-b18c-7e7b24e8189f-ovnkube-config\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976735 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-sysctl-conf\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976741 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976777 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b33fd58c-e89c-424c-ad23-86a03f08d725-iptables-alerter-script\") pod \"iptables-alerter-5qnfh\" (UID: \"b33fd58c-e89c-424c-ad23-86a03f08d725\") " pod="openshift-network-operator/iptables-alerter-5qnfh" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976799 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7cc323ad-cefa-4b44-8df6-0997a152f7ba-tmp\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976814 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-run-systemd\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976834 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-var-lib-cni-multus\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-kubernetes\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976877 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-registration-dir\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976910 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-etc-selinux\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976945 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q4ps\" (UniqueName: \"kubernetes.io/projected/87d38419-312e-4358-b18c-7e7b24e8189f-kube-api-access-7q4ps\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976969 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-cni-binary-copy\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.976994 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-modprobe-d\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977016 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-sysctl-d\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977041 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/39fdb78c-608d-4cb2-8a53-feb04ee1cdcf-serviceca\") pod \"node-ca-vnmqw\" (UID: \"39fdb78c-608d-4cb2-8a53-feb04ee1cdcf\") " pod="openshift-image-registry/node-ca-vnmqw" Apr 17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977065 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87d38419-312e-4358-b18c-7e7b24e8189f-ovnkube-script-lib\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 
17 18:49:06.978358 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977089 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-run-netns\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977113 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-var-lib-cni-bin\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977141 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff6339d8-541a-4947-b00d-5f600d8d08c1-system-cni-dir\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977212 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39fdb78c-608d-4cb2-8a53-feb04ee1cdcf-host\") pod \"node-ca-vnmqw\" (UID: \"39fdb78c-608d-4cb2-8a53-feb04ee1cdcf\") " pod="openshift-image-registry/node-ca-vnmqw" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977358 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff6339d8-541a-4947-b00d-5f600d8d08c1-cnibin\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977360 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b33fd58c-e89c-424c-ad23-86a03f08d725-iptables-alerter-script\") pod \"iptables-alerter-5qnfh\" (UID: \"b33fd58c-e89c-424c-ad23-86a03f08d725\") " pod="openshift-network-operator/iptables-alerter-5qnfh" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977382 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff6339d8-541a-4947-b00d-5f600d8d08c1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977408 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-systemd\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977434 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/ff6339d8-541a-4947-b00d-5f600d8d08c1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977438 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-kubernetes\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977464 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-etc-openvswitch\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977493 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-node-log\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977568 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39fdb78c-608d-4cb2-8a53-feb04ee1cdcf-host\") pod \"node-ca-vnmqw\" (UID: \"39fdb78c-608d-4cb2-8a53-feb04ee1cdcf\") " pod="openshift-image-registry/node-ca-vnmqw" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977672 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-sysctl-d\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977916 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff6339d8-541a-4947-b00d-5f600d8d08c1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.977959 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/39fdb78c-608d-4cb2-8a53-feb04ee1cdcf-serviceca\") pod \"node-ca-vnmqw\" (UID: \"39fdb78c-608d-4cb2-8a53-feb04ee1cdcf\") " pod="openshift-image-registry/node-ca-vnmqw" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.978004 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-modprobe-d\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.979150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.978011 2580 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff6339d8-541a-4947-b00d-5f600d8d08c1-system-cni-dir\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.979700 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.978012 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ff6339d8-541a-4947-b00d-5f600d8d08c1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.979700 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.978055 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-cni-netd\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:06.979700 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.978061 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-systemd\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.979700 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.978097 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-multus-conf-dir\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:06.979700 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.978123 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxdv6\" (UniqueName: \"kubernetes.io/projected/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-kube-api-access-mxdv6\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:06.979700 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.978131 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff6339d8-541a-4947-b00d-5f600d8d08c1-cnibin\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.979700 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.979388 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7cc323ad-cefa-4b44-8df6-0997a152f7ba-etc-tuned\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.979700 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.979408 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7cc323ad-cefa-4b44-8df6-0997a152f7ba-tmp\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.979700 
ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.979666 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/91f3ddfa-7360-41d5-bb54-9c1c21904eb4-agent-certs\") pod \"konnectivity-agent-4vj6d\" (UID: \"91f3ddfa-7360-41d5-bb54-9c1c21904eb4\") " pod="kube-system/konnectivity-agent-4vj6d" Apr 17 18:49:06.988018 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:06.987703 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:06.988018 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:06.987796 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:06.988018 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:06.987907 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2k2fp for pod openshift-network-diagnostics/network-check-target-qbtjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:06.988018 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:06.987992 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp podName:699f0c3d-9fe9-46b1-8ca2-967fca78f239 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:07.487973753 +0000 UTC m=+3.140509657 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2k2fp" (UniqueName: "kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp") pod "network-check-target-qbtjg" (UID: "699f0c3d-9fe9-46b1-8ca2-967fca78f239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:06.988657 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.988631 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lc74\" (UniqueName: \"kubernetes.io/projected/87361592-a029-4a93-9af8-ed4f1a1cc87c-kube-api-access-8lc74\") pod \"node-resolver-zsszv\" (UID: \"87361592-a029-4a93-9af8-ed4f1a1cc87c\") " pod="openshift-dns/node-resolver-zsszv" Apr 17 18:49:06.988740 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.988712 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8crqt\" (UniqueName: \"kubernetes.io/projected/b33fd58c-e89c-424c-ad23-86a03f08d725-kube-api-access-8crqt\") pod \"iptables-alerter-5qnfh\" (UID: \"b33fd58c-e89c-424c-ad23-86a03f08d725\") " pod="openshift-network-operator/iptables-alerter-5qnfh" Apr 17 18:49:06.989400 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.989377 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbz4n\" (UniqueName: \"kubernetes.io/projected/ff6339d8-541a-4947-b00d-5f600d8d08c1-kube-api-access-pbz4n\") pod \"multus-additional-cni-plugins-4hnl2\" (UID: \"ff6339d8-541a-4947-b00d-5f600d8d08c1\") " pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:06.989680 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.989663 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j7jx\" (UniqueName: 
\"kubernetes.io/projected/7cc323ad-cefa-4b44-8df6-0997a152f7ba-kube-api-access-2j7jx\") pod \"tuned-t56ft\" (UID: \"7cc323ad-cefa-4b44-8df6-0997a152f7ba\") " pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:06.991972 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:06.991953 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jzdk\" (UniqueName: \"kubernetes.io/projected/39fdb78c-608d-4cb2-8a53-feb04ee1cdcf-kube-api-access-2jzdk\") pod \"node-ca-vnmqw\" (UID: \"39fdb78c-608d-4cb2-8a53-feb04ee1cdcf\") " pod="openshift-image-registry/node-ca-vnmqw" Apr 17 18:49:07.078718 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078684 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-multus-socket-dir-parent\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.078718 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078721 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-var-lib-kubelet\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078763 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-hostroot\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078796 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-log-socket\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078818 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-cnibin\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078814 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-multus-socket-dir-parent\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078836 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-sys-fs\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078858 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-kubelet\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078873 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-hostroot\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078882 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-systemd-units\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078875 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-var-lib-kubelet\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078886 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-log-socket\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078923 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-slash\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078931 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-kubelet\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078945 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-cnibin\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078950 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-run-ovn\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078944 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-sys-fs\") pod 
\"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078986 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-run-ovn\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.078982 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078952 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-systemd-units\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078987 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-slash\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.078988 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-run-ovn-kubernetes\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079019 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-run-ovn-kubernetes\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079036 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079062 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-multus-cni-dir\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079083 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-os-release\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079103 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079123 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87d38419-312e-4358-b18c-7e7b24e8189f-ovn-node-metrics-cert\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079153 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-multus-cni-dir\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079148 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-run-k8s-cni-cncf-io\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079197 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-run-multus-certs\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079194 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-run-k8s-cni-cncf-io\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079213 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-os-release\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079240 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-run-multus-certs\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079265 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-etc-kubernetes\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079283 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.079644 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079302 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-socket-dir\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079313 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-etc-kubernetes\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079326 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-multus-daemon-config\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079344 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-kubelet-dir\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079353 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87d38419-312e-4358-b18c-7e7b24e8189f-env-overrides\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079383 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-system-cni-dir\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079405 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whvkf\" (UniqueName: \"kubernetes.io/projected/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-kube-api-access-whvkf\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079433 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-var-lib-openvswitch\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079446 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-socket-dir\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079455 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87d38419-312e-4358-b18c-7e7b24e8189f-ovnkube-config\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079479 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079504 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-run-systemd\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079529 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-var-lib-cni-multus\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079551 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-var-lib-openvswitch\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079556 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-registration-dir\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079596 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-registration-dir\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079611 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-etc-selinux\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.080224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079634 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-system-cni-dir\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079638 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q4ps\" (UniqueName: \"kubernetes.io/projected/87d38419-312e-4358-b18c-7e7b24e8189f-kube-api-access-7q4ps\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079680 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-cni-binary-copy\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079708 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87d38419-312e-4358-b18c-7e7b24e8189f-ovnkube-script-lib\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079730 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-run-netns\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079752 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-var-lib-cni-bin\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079782 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-etc-openvswitch\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079808 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-node-log\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079843 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-run-systemd\") pod \"ovnkube-node-7klhd\" (UID: 
\"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079862 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-cni-netd\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:07.079943 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:07.079998 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs podName:70ceb0f8-7a3d-4e29-9470-f18b8af1daa1 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:07.57998119 +0000 UTC m=+3.232517097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs") pod "network-metrics-daemon-v24kx" (UID: "70ceb0f8-7a3d-4e29-9470-f18b8af1daa1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080009 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-multus-daemon-config\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080012 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-multus-conf-dir\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080062 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-var-lib-cni-multus\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080063 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87d38419-312e-4358-b18c-7e7b24e8189f-ovnkube-config\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxdv6\" (UniqueName: \"kubernetes.io/projected/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-kube-api-access-mxdv6\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:07.080883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080099 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-etc-openvswitch\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080124 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xb5k\" (UniqueName: \"kubernetes.io/projected/c0f5f82e-6018-4e89-909a-07d817ebf145-kube-api-access-8xb5k\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080126 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-etc-selinux\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080139 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-run-netns\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.079810 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87d38419-312e-4358-b18c-7e7b24e8189f-env-overrides\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080169 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-run-netns\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080184 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-cni-netd\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080197 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-device-dir\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080201 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-host-var-lib-cni-bin\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080238 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-node-log\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-multus-conf-dir\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080283 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-run-netns\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080349 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c0f5f82e-6018-4e89-909a-07d817ebf145-device-dir\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080345 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-run-openvswitch\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080384 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-run-openvswitch\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080387 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-cni-bin\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080490 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87d38419-312e-4358-b18c-7e7b24e8189f-host-cni-bin\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.080645 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-cni-binary-copy\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.081749 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.081040 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87d38419-312e-4358-b18c-7e7b24e8189f-ovnkube-script-lib\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.082632 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.081783 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87d38419-312e-4358-b18c-7e7b24e8189f-ovn-node-metrics-cert\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.093169 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.093095 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q4ps\" (UniqueName: \"kubernetes.io/projected/87d38419-312e-4358-b18c-7e7b24e8189f-kube-api-access-7q4ps\") pod \"ovnkube-node-7klhd\" (UID: \"87d38419-312e-4358-b18c-7e7b24e8189f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.095645 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.095622 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxdv6\" (UniqueName: \"kubernetes.io/projected/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-kube-api-access-mxdv6\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:07.095798 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.095773 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whvkf\" (UniqueName: \"kubernetes.io/projected/30fb60c5-1e4d-49ac-bfdc-b8a3ba658316-kube-api-access-whvkf\") pod \"multus-vcf4s\" (UID: \"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316\") " pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.095946 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.095830 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xb5k\" (UniqueName: \"kubernetes.io/projected/c0f5f82e-6018-4e89-909a-07d817ebf145-kube-api-access-8xb5k\") pod \"aws-ebs-csi-driver-node-n9tf8\" (UID: \"c0f5f82e-6018-4e89-909a-07d817ebf145\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.166670 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.166641 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4vj6d" Apr 17 18:49:07.179311 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.179280 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zsszv" Apr 17 18:49:07.187984 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.187963 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vnmqw" Apr 17 18:49:07.198597 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.198575 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5qnfh" Apr 17 18:49:07.204254 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.204231 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4hnl2" Apr 17 18:49:07.211886 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.211868 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t56ft" Apr 17 18:49:07.217555 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.217532 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" Apr 17 18:49:07.224262 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.224242 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:07.229839 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.229820 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vcf4s" Apr 17 18:49:07.341404 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.341367 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:07.483215 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.483125 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:07.483350 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:07.483260 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:07.483350 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:07.483319 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret podName:df4bf5b8-bdc2-4ccc-b126-8107588c1304 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:08.483299862 +0000 UTC m=+4.135835781 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret") pod "global-pull-secret-syncer-jhmxj" (UID: "df4bf5b8-bdc2-4ccc-b126-8107588c1304") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:07.583471 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.583432 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k2fp\" (UniqueName: \"kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp\") pod \"network-check-target-qbtjg\" (UID: \"699f0c3d-9fe9-46b1-8ca2-967fca78f239\") " pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:07.583649 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.583525 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:07.583649 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:07.583581 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:07.583649 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:07.583603 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:07.583649 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:07.583616 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2k2fp for pod openshift-network-diagnostics/network-check-target-qbtjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:07.583649 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:07.583638 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:07.583831 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:07.583682 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp podName:699f0c3d-9fe9-46b1-8ca2-967fca78f239 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:08.583664197 +0000 UTC m=+4.236200117 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2k2fp" (UniqueName: "kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp") pod "network-check-target-qbtjg" (UID: "699f0c3d-9fe9-46b1-8ca2-967fca78f239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:07.583831 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:07.583696 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs podName:70ceb0f8-7a3d-4e29-9470-f18b8af1daa1 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:08.583690567 +0000 UTC m=+4.236226474 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs") pod "network-metrics-daemon-v24kx" (UID: "70ceb0f8-7a3d-4e29-9470-f18b8af1daa1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:07.756601 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:07.756565 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91f3ddfa_7360_41d5_bb54_9c1c21904eb4.slice/crio-3a9b620938906e70b1be49e6b4acc228865934d974d7e8ff3a73e02b314dabc9 WatchSource:0}: Error finding container 3a9b620938906e70b1be49e6b4acc228865934d974d7e8ff3a73e02b314dabc9: Status 404 returned error can't find the container with id 3a9b620938906e70b1be49e6b4acc228865934d974d7e8ff3a73e02b314dabc9 Apr 17 18:49:07.758545 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:07.758477 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87d38419_312e_4358_b18c_7e7b24e8189f.slice/crio-8a1ee2e4f8b0a04bd1c15c17947bc1f7f30f0619af63ce1a50792c17640aef74 WatchSource:0}: Error finding container 8a1ee2e4f8b0a04bd1c15c17947bc1f7f30f0619af63ce1a50792c17640aef74: Status 404 returned error can't find the container with id 8a1ee2e4f8b0a04bd1c15c17947bc1f7f30f0619af63ce1a50792c17640aef74 Apr 17 18:49:07.762154 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:07.762113 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0f5f82e_6018_4e89_909a_07d817ebf145.slice/crio-e773108963cc609ca3d746aafe613bd286214ae6d777474f054c747a68c1ccd2 WatchSource:0}: Error finding container e773108963cc609ca3d746aafe613bd286214ae6d777474f054c747a68c1ccd2: Status 404 returned error can't find the container with id e773108963cc609ca3d746aafe613bd286214ae6d777474f054c747a68c1ccd2 Apr 17 18:49:07.762790 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:07.762765 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff6339d8_541a_4947_b00d_5f600d8d08c1.slice/crio-59ca30223c3f24656f5f74567bd9a62960e32450323bfab6401f8d6c7432e0ce WatchSource:0}: Error finding container 59ca30223c3f24656f5f74567bd9a62960e32450323bfab6401f8d6c7432e0ce: Status 404 returned error can't find the container with id 59ca30223c3f24656f5f74567bd9a62960e32450323bfab6401f8d6c7432e0ce Apr 17 18:49:07.763654 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:07.763628 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30fb60c5_1e4d_49ac_bfdc_b8a3ba658316.slice/crio-1214591a3e249b1500e2d22fad99297346db5081be1607a52fda04807c2c0e69 WatchSource:0}: Error finding container 1214591a3e249b1500e2d22fad99297346db5081be1607a52fda04807c2c0e69: Status 404 returned error can't find the container with id 1214591a3e249b1500e2d22fad99297346db5081be1607a52fda04807c2c0e69 Apr 17 18:49:07.765066 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:07.764857 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb33fd58c_e89c_424c_ad23_86a03f08d725.slice/crio-afab3cf8f13381780baf4c1bbaaaefbda3bfca4c9c17e5e2e259d6c9a2bc586c WatchSource:0}: Error finding container afab3cf8f13381780baf4c1bbaaaefbda3bfca4c9c17e5e2e259d6c9a2bc586c: Status 404 returned error can't find the 
container with id afab3cf8f13381780baf4c1bbaaaefbda3bfca4c9c17e5e2e259d6c9a2bc586c Apr 17 18:49:07.766037 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:49:07.766011 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cc323ad_cefa_4b44_8df6_0997a152f7ba.slice/crio-41bd6cb002278e55bde8f7baa42bc5c4d87245804143c37b96fa2e0a6d7f9399 WatchSource:0}: Error finding container 41bd6cb002278e55bde8f7baa42bc5c4d87245804143c37b96fa2e0a6d7f9399: Status 404 returned error can't find the container with id 41bd6cb002278e55bde8f7baa42bc5c4d87245804143c37b96fa2e0a6d7f9399 Apr 17 18:49:07.895611 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.895448 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 18:44:05 +0000 UTC" deadline="2027-12-29 14:15:50.164671538 +0000 UTC" Apr 17 18:49:07.895611 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.895607 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14899h26m42.269067002s" Apr 17 18:49:07.924912 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.924865 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:07.979541 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.979502 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vnmqw" event={"ID":"39fdb78c-608d-4cb2-8a53-feb04ee1cdcf","Type":"ContainerStarted","Data":"6e597fe3cd2b1cea6382396cfdb730af1bc131360335255ac5c6e05497c02219"} Apr 17 18:49:07.980453 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.980425 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4hnl2" event={"ID":"ff6339d8-541a-4947-b00d-5f600d8d08c1","Type":"ContainerStarted","Data":"59ca30223c3f24656f5f74567bd9a62960e32450323bfab6401f8d6c7432e0ce"} Apr 17 18:49:07.982336 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.982306 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4vj6d" event={"ID":"91f3ddfa-7360-41d5-bb54-9c1c21904eb4","Type":"ContainerStarted","Data":"3a9b620938906e70b1be49e6b4acc228865934d974d7e8ff3a73e02b314dabc9"} Apr 17 18:49:07.985028 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.985001 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zsszv" event={"ID":"87361592-a029-4a93-9af8-ed4f1a1cc87c","Type":"ContainerStarted","Data":"f1bcc2e46b768fcb6b27f6ec70925e306a0c54f9be41d787752bcdaa4c9a6490"} Apr 17 18:49:07.986142 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.986117 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5qnfh" event={"ID":"b33fd58c-e89c-424c-ad23-86a03f08d725","Type":"ContainerStarted","Data":"afab3cf8f13381780baf4c1bbaaaefbda3bfca4c9c17e5e2e259d6c9a2bc586c"} Apr 17 18:49:07.987105 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.987083 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t56ft" event={"ID":"7cc323ad-cefa-4b44-8df6-0997a152f7ba","Type":"ContainerStarted","Data":"41bd6cb002278e55bde8f7baa42bc5c4d87245804143c37b96fa2e0a6d7f9399"} Apr 17 18:49:07.987992 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.987961 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vcf4s" 
event={"ID":"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316","Type":"ContainerStarted","Data":"1214591a3e249b1500e2d22fad99297346db5081be1607a52fda04807c2c0e69"} Apr 17 18:49:07.988875 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.988856 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" event={"ID":"c0f5f82e-6018-4e89-909a-07d817ebf145","Type":"ContainerStarted","Data":"e773108963cc609ca3d746aafe613bd286214ae6d777474f054c747a68c1ccd2"} Apr 17 18:49:07.989818 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.989798 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" event={"ID":"87d38419-312e-4358-b18c-7e7b24e8189f","Type":"ContainerStarted","Data":"8a1ee2e4f8b0a04bd1c15c17947bc1f7f30f0619af63ce1a50792c17640aef74"} Apr 17 18:49:07.991239 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:07.991172 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal" event={"ID":"6402b5e4dc46963653aa05278c9bac43","Type":"ContainerStarted","Data":"b7f6d1ed9971a2fbfce96746045d54ba909cbe3a7297f48c4c19f50de13b586c"} Apr 17 18:49:08.007173 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:08.007137 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal" podStartSLOduration=2.007125417 podStartE2EDuration="2.007125417s" podCreationTimestamp="2026-04-17 18:49:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:49:08.006402052 +0000 UTC m=+3.658938010" watchObservedRunningTime="2026-04-17 18:49:08.007125417 +0000 UTC m=+3.659661343" Apr 17 18:49:08.490995 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:08.490309 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:08.490995 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:08.490446 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:08.490995 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:08.490506 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret podName:df4bf5b8-bdc2-4ccc-b126-8107588c1304 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:10.490487479 +0000 UTC m=+6.143023388 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret") pod "global-pull-secret-syncer-jhmxj" (UID: "df4bf5b8-bdc2-4ccc-b126-8107588c1304") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:08.591511 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:08.591477 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:08.591667 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:08.591545 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k2fp\" (UniqueName: \"kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp\") pod \"network-check-target-qbtjg\" (UID: \"699f0c3d-9fe9-46b1-8ca2-967fca78f239\") " pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:08.591754 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:08.591736 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:08.591807 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:08.591762 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:08.591807 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:08.591776 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2k2fp for pod openshift-network-diagnostics/network-check-target-qbtjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:08.591933 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:08.591832 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp podName:699f0c3d-9fe9-46b1-8ca2-967fca78f239 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:10.591813717 +0000 UTC m=+6.244349623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2k2fp" (UniqueName: "kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp") pod "network-check-target-qbtjg" (UID: "699f0c3d-9fe9-46b1-8ca2-967fca78f239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:08.591933 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:08.591918 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:08.592045 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:08.591952 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs podName:70ceb0f8-7a3d-4e29-9470-f18b8af1daa1 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:10.591940685 +0000 UTC m=+6.244476594 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs") pod "network-metrics-daemon-v24kx" (UID: "70ceb0f8-7a3d-4e29-9470-f18b8af1daa1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:08.980122 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:08.975923 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:08.980122 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:08.976063 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:08.980122 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:08.976517 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:08.980122 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:08.976607 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:08.980122 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:08.976693 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:08.980122 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:08.976767 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:09.002336 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:09.002303 2580 generic.go:358] "Generic (PLEG): container finished" podID="a7b645763bd4d0264284f3e94b95b589" containerID="da57c2702db590b7545fc0283c42c814bf66f77de0b0c61d18f9a2ddf8ce52ef" exitCode=0 Apr 17 18:49:09.003190 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:09.003161 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" event={"ID":"a7b645763bd4d0264284f3e94b95b589","Type":"ContainerDied","Data":"da57c2702db590b7545fc0283c42c814bf66f77de0b0c61d18f9a2ddf8ce52ef"} Apr 17 18:49:10.012574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:10.012082 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" event={"ID":"a7b645763bd4d0264284f3e94b95b589","Type":"ContainerStarted","Data":"bca361751153f376add88b9633abbfc6c1a8215c363646a28ee3860b8cf31d50"} Apr 17 18:49:10.509781 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:10.509695 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:10.509977 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:10.509831 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:10.509977 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:10.509911 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret podName:df4bf5b8-bdc2-4ccc-b126-8107588c1304 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:14.509873622 +0000 UTC m=+10.162409526 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret") pod "global-pull-secret-syncer-jhmxj" (UID: "df4bf5b8-bdc2-4ccc-b126-8107588c1304") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:10.610228 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:10.610188 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:10.610404 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:10.610256 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k2fp\" (UniqueName: \"kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp\") pod \"network-check-target-qbtjg\" (UID: \"699f0c3d-9fe9-46b1-8ca2-967fca78f239\") " pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:10.610463 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:10.610415 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:10.610463 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:10.610434 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:10.610463 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:10.610449 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2k2fp for pod openshift-network-diagnostics/network-check-target-qbtjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:10.610604 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:10.610510 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp podName:699f0c3d-9fe9-46b1-8ca2-967fca78f239 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:14.610491377 +0000 UTC m=+10.263027284 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2k2fp" (UniqueName: "kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp") pod "network-check-target-qbtjg" (UID: "699f0c3d-9fe9-46b1-8ca2-967fca78f239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:10.610952 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:10.610928 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:10.611053 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:10.610983 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs podName:70ceb0f8-7a3d-4e29-9470-f18b8af1daa1 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:14.610969239 +0000 UTC m=+10.263505144 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs") pod "network-metrics-daemon-v24kx" (UID: "70ceb0f8-7a3d-4e29-9470-f18b8af1daa1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:10.973729 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:10.973643 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:10.973866 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:10.973834 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:10.974305 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:10.974283 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:10.974404 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:10.974384 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:10.974579 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:10.974549 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:10.974728 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:10.974672 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:12.974030 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:12.973326 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:12.974030 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:12.973461 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:12.974030 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:12.973333 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:12.974030 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:12.973567 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:12.974030 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:12.973325 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:12.974030 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:12.973636 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:14.544388 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:14.544340 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:14.544754 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:14.544514 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:14.544754 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:14.544596 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret podName:df4bf5b8-bdc2-4ccc-b126-8107588c1304 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:22.544574337 +0000 UTC m=+18.197110245 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret") pod "global-pull-secret-syncer-jhmxj" (UID: "df4bf5b8-bdc2-4ccc-b126-8107588c1304") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:14.645698 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:14.645661 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k2fp\" (UniqueName: \"kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp\") pod \"network-check-target-qbtjg\" (UID: \"699f0c3d-9fe9-46b1-8ca2-967fca78f239\") " pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:14.645889 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:14.645750 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:14.645889 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:14.645841 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:14.645889 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:14.645859 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:14.645889 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:14.645865 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:14.645889 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:14.645882 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2k2fp for pod openshift-network-diagnostics/network-check-target-qbtjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:14.646166 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:14.645938 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs podName:70ceb0f8-7a3d-4e29-9470-f18b8af1daa1 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:22.645915916 +0000 UTC m=+18.298451821 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs") pod "network-metrics-daemon-v24kx" (UID: "70ceb0f8-7a3d-4e29-9470-f18b8af1daa1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:14.646166 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:14.645959 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp podName:699f0c3d-9fe9-46b1-8ca2-967fca78f239 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:22.645949545 +0000 UTC m=+18.298485449 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2k2fp" (UniqueName: "kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp") pod "network-check-target-qbtjg" (UID: "699f0c3d-9fe9-46b1-8ca2-967fca78f239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:14.974801 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:14.974296 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:14.974801 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:14.974329 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:14.974801 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:14.974296 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:14.974801 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:14.974422 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:14.974801 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:14.974490 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:14.974801 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:14.974556 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:16.973106 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:16.973071 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:16.973551 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:16.973116 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:16.973551 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:16.973178 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:16.973551 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:16.973297 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:16.973551 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:16.973375 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:16.973551 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:16.973491 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:18.973980 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:18.973936 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:18.974512 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:18.973936 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:18.974512 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:18.974057 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:18.974512 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:18.973955 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:18.974512 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:18.974150 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:18.974512 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:18.974239 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:20.973505 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:20.973391 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:20.973505 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:20.973472 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:20.973505 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:20.973483 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:20.973948 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:20.973591 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:20.973948 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:20.973677 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:20.974064 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:20.974038 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:22.610087 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:22.610050 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:22.610509 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:22.610176 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:22.610509 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:22.610242 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret podName:df4bf5b8-bdc2-4ccc-b126-8107588c1304 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:38.610228724 +0000 UTC m=+34.262764631 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret") pod "global-pull-secret-syncer-jhmxj" (UID: "df4bf5b8-bdc2-4ccc-b126-8107588c1304") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:22.710597 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:22.710551 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:22.710780 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:22.710617 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k2fp\" (UniqueName: \"kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp\") pod \"network-check-target-qbtjg\" (UID: \"699f0c3d-9fe9-46b1-8ca2-967fca78f239\") " pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:22.710780 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:22.710680 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:22.710780 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:22.710754 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs podName:70ceb0f8-7a3d-4e29-9470-f18b8af1daa1 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:38.710734719 +0000 UTC m=+34.363270627 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs") pod "network-metrics-daemon-v24kx" (UID: "70ceb0f8-7a3d-4e29-9470-f18b8af1daa1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:22.710975 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:22.710856 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:22.710975 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:22.710877 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:22.710975 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:22.710906 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2k2fp for pod openshift-network-diagnostics/network-check-target-qbtjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:22.710975 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:22.710959 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp podName:699f0c3d-9fe9-46b1-8ca2-967fca78f239 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:38.710942389 +0000 UTC m=+34.363478297 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2k2fp" (UniqueName: "kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp") pod "network-check-target-qbtjg" (UID: "699f0c3d-9fe9-46b1-8ca2-967fca78f239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:22.973583 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:22.973503 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:22.973740 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:22.973607 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:22.973740 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:22.973502 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:22.973740 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:22.973679 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:22.973740 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:22.973502 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:22.973978 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:22.973748 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:24.973913 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:24.973858 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:24.974306 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:24.973998 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:24.974306 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:24.974012 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:24.974306 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:24.973996 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:24.974306 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:24.974120 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:24.974306 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:24.974204 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:26.044340 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.043885 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" event={"ID":"c0f5f82e-6018-4e89-909a-07d817ebf145","Type":"ContainerStarted","Data":"e1fbbaf9b2c05df2ebb064b6c9e6e9c66bf9a6a71e6256c9885eaac3471f8fe9"} Apr 17 18:49:26.046733 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.046712 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 18:49:26.047063 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.047040 2580 generic.go:358] "Generic (PLEG): container finished" podID="87d38419-312e-4358-b18c-7e7b24e8189f" containerID="2bfab49ad437acd1627907be7ba20f8ea0ece199c48641c842ce24adaa881f1b" exitCode=1 Apr 17 18:49:26.047145 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.047121 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" event={"ID":"87d38419-312e-4358-b18c-7e7b24e8189f","Type":"ContainerStarted","Data":"1e5c84fc4f1999df28f6c56fff6676ecf3a1bf8cb189e3332d6668444f9a97a2"} Apr 17 18:49:26.047190 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.047161 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" event={"ID":"87d38419-312e-4358-b18c-7e7b24e8189f","Type":"ContainerStarted","Data":"6af22cffbaf2c48c31c15b21e6c588c3e335c65ead91db1ceea839670faa4e1d"} Apr 17 18:49:26.047190 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.047179 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" event={"ID":"87d38419-312e-4358-b18c-7e7b24e8189f","Type":"ContainerStarted","Data":"abb69b499b3dc10be28ab1f5bb0a18906af8a63af7911a7b0e5027004254d884"} Apr 17 18:49:26.047270 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.047192 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" event={"ID":"87d38419-312e-4358-b18c-7e7b24e8189f","Type":"ContainerStarted","Data":"87c93a2f8ddb67cd0254467efa8c54def4a40aecee8d6c7102fdfcb20c201776"} Apr 17 18:49:26.047270 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.047206 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" 
event={"ID":"87d38419-312e-4358-b18c-7e7b24e8189f","Type":"ContainerDied","Data":"2bfab49ad437acd1627907be7ba20f8ea0ece199c48641c842ce24adaa881f1b"} Apr 17 18:49:26.047270 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.047220 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" event={"ID":"87d38419-312e-4358-b18c-7e7b24e8189f","Type":"ContainerStarted","Data":"796516ae8ab47bdb1b9309cc009b44764f05452442e9ba7181cdf80168151f8e"} Apr 17 18:49:26.048845 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.048824 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vnmqw" event={"ID":"39fdb78c-608d-4cb2-8a53-feb04ee1cdcf","Type":"ContainerStarted","Data":"df2100948ef43ac98dcb9c451abdf3ee3bee7bd0cd09813e185f85d697f9eb34"} Apr 17 18:49:26.050949 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.050844 2580 generic.go:358] "Generic (PLEG): container finished" podID="ff6339d8-541a-4947-b00d-5f600d8d08c1" containerID="e843a175237be8e20be3643a29ac40d45944b7d8960aefcc76e3cc330b6a38b6" exitCode=0 Apr 17 18:49:26.050949 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.050934 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4hnl2" event={"ID":"ff6339d8-541a-4947-b00d-5f600d8d08c1","Type":"ContainerDied","Data":"e843a175237be8e20be3643a29ac40d45944b7d8960aefcc76e3cc330b6a38b6"} Apr 17 18:49:26.052245 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.052218 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4vj6d" event={"ID":"91f3ddfa-7360-41d5-bb54-9c1c21904eb4","Type":"ContainerStarted","Data":"143809f8d9c549c9be9657af18f83b7026664d3d99993a8502ad3e453ad5961b"} Apr 17 18:49:26.054675 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.054657 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zsszv" event={"ID":"87361592-a029-4a93-9af8-ed4f1a1cc87c","Type":"ContainerStarted","Data":"fb68b1233be9dee6c71e4846c5b8c49ebad5b48fd6304f6ba705354631f7c121"} Apr 17 18:49:26.056131 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.056108 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t56ft" event={"ID":"7cc323ad-cefa-4b44-8df6-0997a152f7ba","Type":"ContainerStarted","Data":"b5b4dec39b6a8c91f33cc32502f69b03a04ef58e6dcc7c6fe4cb7e0056e33b50"} Apr 17 18:49:26.058137 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.057712 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vcf4s" event={"ID":"30fb60c5-1e4d-49ac-bfdc-b8a3ba658316","Type":"ContainerStarted","Data":"17d4403228783390362ca2f912a82755fb500bf9070567fd0e457f7531888272"} Apr 17 18:49:26.063517 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.063471 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vnmqw" podStartSLOduration=3.956467938 podStartE2EDuration="21.063456588s" podCreationTimestamp="2026-04-17 18:49:05 +0000 UTC" firstStartedPulling="2026-04-17 18:49:07.771042189 +0000 UTC m=+3.423578094" lastFinishedPulling="2026-04-17 18:49:24.87803083 +0000 UTC m=+20.530566744" observedRunningTime="2026-04-17 18:49:26.062987622 +0000 UTC m=+21.715523561" watchObservedRunningTime="2026-04-17 18:49:26.063456588 +0000 UTC m=+21.715992512" Apr 17 18:49:26.063983 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.063945 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" podStartSLOduration=20.063935493 podStartE2EDuration="20.063935493s" podCreationTimestamp="2026-04-17 18:49:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:49:10.02714768 +0000 UTC m=+5.679683607" watchObservedRunningTime="2026-04-17 18:49:26.063935493 +0000 UTC m=+21.716471421" Apr 17 18:49:26.078268 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.078222 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vcf4s" podStartSLOduration=3.648160684 podStartE2EDuration="21.078206249s" podCreationTimestamp="2026-04-17 18:49:05 +0000 UTC" firstStartedPulling="2026-04-17 18:49:07.766093006 +0000 UTC m=+3.418628919" lastFinishedPulling="2026-04-17 18:49:25.196138577 +0000 UTC m=+20.848674484" observedRunningTime="2026-04-17 18:49:26.077967503 +0000 UTC m=+21.730503426" watchObservedRunningTime="2026-04-17 18:49:26.078206249 +0000 UTC m=+21.730742176" Apr 17 18:49:26.092840 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.092798 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4vj6d" podStartSLOduration=8.367159337 podStartE2EDuration="21.092788186s" podCreationTimestamp="2026-04-17 18:49:05 +0000 UTC" firstStartedPulling="2026-04-17 18:49:07.758791842 +0000 UTC m=+3.411327749" lastFinishedPulling="2026-04-17 18:49:20.484420681 +0000 UTC m=+16.136956598" observedRunningTime="2026-04-17 18:49:26.092393525 +0000 UTC m=+21.744929452" watchObservedRunningTime="2026-04-17 18:49:26.092788186 +0000 UTC m=+21.745324112" Apr 17 18:49:26.105334 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.105289 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zsszv" podStartSLOduration=3.73560108 podStartE2EDuration="21.105278182s" podCreationTimestamp="2026-04-17 18:49:05 +0000 UTC" firstStartedPulling="2026-04-17 18:49:07.770181923 +0000 UTC m=+3.422717830" lastFinishedPulling="2026-04-17 18:49:25.139859011 +0000 UTC m=+20.792394932" observedRunningTime="2026-04-17 18:49:26.105168793 +0000 UTC m=+21.757704719" watchObservedRunningTime="2026-04-17 18:49:26.105278182 +0000 UTC m=+21.757814108" Apr 17 18:49:26.137586 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.137533 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-t56ft" podStartSLOduration=3.752405858 podStartE2EDuration="21.137515013s" podCreationTimestamp="2026-04-17 18:49:05 +0000 UTC" firstStartedPulling="2026-04-17 18:49:07.769001772 +0000 UTC m=+3.421537687" lastFinishedPulling="2026-04-17 18:49:25.154110925 +0000 UTC m=+20.806646842" observedRunningTime="2026-04-17 18:49:26.118925825 +0000 UTC m=+21.771461752" watchObservedRunningTime="2026-04-17 18:49:26.137515013 +0000 UTC m=+21.790050932" Apr 17 18:49:26.883991 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.883962 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 18:49:26.941735 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.941600 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T18:49:26.883986875Z","UUID":"47d9e1a8-dd3e-4453-8383-1933e83c0076","Handler":null,"Name":"","Endpoint":""} Apr 17 18:49:26.944536 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.944513 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 18:49:26.944654 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.944548 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 18:49:26.973943 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.973856 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:26.973943 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.973910 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:26.973943 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:26.973910 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:26.974162 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:26.974000 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:26.974162 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:26.974071 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:26.974250 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:26.974156 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:27.062090 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:27.062031 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5qnfh" event={"ID":"b33fd58c-e89c-424c-ad23-86a03f08d725","Type":"ContainerStarted","Data":"38eb1d98418223ef8f5d9e6b358d49a352305e90f3a2da6090d8010ac4df609a"} Apr 17 18:49:27.064869 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:27.064805 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" event={"ID":"c0f5f82e-6018-4e89-909a-07d817ebf145","Type":"ContainerStarted","Data":"26eded09658fc274411079c0eb17b52485496eb40ab623191a34102f4d8a2316"} Apr 17 18:49:27.075324 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:27.075276 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5qnfh" podStartSLOduration=4.965051344 podStartE2EDuration="22.075261386s" podCreationTimestamp="2026-04-17 18:49:05 +0000 UTC" firstStartedPulling="2026-04-17 18:49:07.767830063 +0000 UTC m=+3.420365970" lastFinishedPulling="2026-04-17 18:49:24.878040103 +0000 UTC m=+20.530576012" observedRunningTime="2026-04-17 18:49:27.074842014 +0000 UTC m=+22.727377942" watchObservedRunningTime="2026-04-17 18:49:27.075261386 +0000 UTC m=+22.727797311" Apr 17 18:49:28.973888 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:28.973663 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:28.973888 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:28.973688 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:28.974426 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:28.973722 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:28.974426 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:28.974012 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:28.974426 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:28.974146 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:28.974426 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:28.974248 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:29.069889 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:29.069853 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" event={"ID":"c0f5f82e-6018-4e89-909a-07d817ebf145","Type":"ContainerStarted","Data":"6f3361f40e3fc651d4e37d363261ca99f165d288eb316f16665a66516e99b939"} Apr 17 18:49:29.072387 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:29.072367 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 18:49:29.072712 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:29.072679 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" event={"ID":"87d38419-312e-4358-b18c-7e7b24e8189f","Type":"ContainerStarted","Data":"e2d55df509b9ff6609856ade862a1e25b3d8fd921a946a38fc2b56b42814c5ed"} Apr 17 18:49:29.085354 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:29.085293 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-n9tf8" podStartSLOduration=3.734307973 podStartE2EDuration="24.085277322s" podCreationTimestamp="2026-04-17 18:49:05 +0000 UTC" firstStartedPulling="2026-04-17 18:49:07.763869805 +0000 UTC m=+3.416405709" lastFinishedPulling="2026-04-17 18:49:28.114839151 +0000 UTC m=+23.767375058" observedRunningTime="2026-04-17 18:49:29.084796461 +0000 UTC m=+24.737332387" watchObservedRunningTime="2026-04-17 18:49:29.085277322 +0000 UTC m=+24.737813249" Apr 17 18:49:29.812247 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:29.812209 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4vj6d" Apr 17 18:49:29.812850 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:29.812834 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4vj6d" Apr 17 18:49:30.075070 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:30.074998 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4vj6d" Apr 17 18:49:30.075500 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:30.075316 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4vj6d" Apr 17 18:49:30.973978 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:30.973941 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:30.974162 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:30.973949 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:30.974162 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:30.974086 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:30.974162 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:30.973949 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:30.974330 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:30.974157 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:30.974330 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:30.974226 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:32.081235 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:32.081050 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 18:49:32.081947 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:32.081534 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" event={"ID":"87d38419-312e-4358-b18c-7e7b24e8189f","Type":"ContainerStarted","Data":"20706e6bc4c5e73bf756818c815339a77ba1b1ac82f9cff02ab24f46b112eb86"} Apr 17 18:49:32.081947 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:32.081912 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:32.082093 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:32.082066 2580 scope.go:117] "RemoveContainer" containerID="2bfab49ad437acd1627907be7ba20f8ea0ece199c48641c842ce24adaa881f1b" Apr 17 18:49:32.083376 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:32.083350 2580 generic.go:358] "Generic (PLEG): container finished" podID="ff6339d8-541a-4947-b00d-5f600d8d08c1" containerID="ac7e8f903758ef15eecd48b2d6ae990d33965f62ba7ac7ddc009ff9c3ceae17b" exitCode=0 Apr 17 18:49:32.083486 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:32.083432 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4hnl2" event={"ID":"ff6339d8-541a-4947-b00d-5f600d8d08c1","Type":"ContainerDied","Data":"ac7e8f903758ef15eecd48b2d6ae990d33965f62ba7ac7ddc009ff9c3ceae17b"} Apr 17 18:49:32.100929 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:32.100882 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:32.973761 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:32.973737 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:32.973886 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:32.973770 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:32.973886 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:32.973741 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:32.973886 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:32.973837 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:32.974008 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:32.973927 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:32.974008 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:32.973996 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:33.088191 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.088095 2580 generic.go:358] "Generic (PLEG): container finished" podID="ff6339d8-541a-4947-b00d-5f600d8d08c1" containerID="22a737b82fba03c3cfbadd292f8d9ad5e9c7741e77000682564f529b5d0d252f" exitCode=0 Apr 17 18:49:33.088629 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.088225 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4hnl2" event={"ID":"ff6339d8-541a-4947-b00d-5f600d8d08c1","Type":"ContainerDied","Data":"22a737b82fba03c3cfbadd292f8d9ad5e9c7741e77000682564f529b5d0d252f"} Apr 17 18:49:33.097360 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.097340 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 18:49:33.097670 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.097646 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" event={"ID":"87d38419-312e-4358-b18c-7e7b24e8189f","Type":"ContainerStarted","Data":"6ba72760a4ead227ca54bf804d301a1e9ba8e9ca4ef4406f16f50026f59c0f20"} Apr 17 18:49:33.097980 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.097955 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:33.098067 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.097990 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:33.114711 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.114679 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:49:33.162053 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.161995 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" 
podStartSLOduration=10.689908483 podStartE2EDuration="28.161976752s" podCreationTimestamp="2026-04-17 18:49:05 +0000 UTC" firstStartedPulling="2026-04-17 18:49:07.760876042 +0000 UTC m=+3.413411947" lastFinishedPulling="2026-04-17 18:49:25.232944309 +0000 UTC m=+20.885480216" observedRunningTime="2026-04-17 18:49:33.136559362 +0000 UTC m=+28.789095288" watchObservedRunningTime="2026-04-17 18:49:33.161976752 +0000 UTC m=+28.814512681" Apr 17 18:49:33.371034 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.370949 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jhmxj"] Apr 17 18:49:33.371199 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.371089 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:33.371199 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:33.371181 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:33.374605 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.374576 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qbtjg"] Apr 17 18:49:33.374752 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.374712 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:33.374839 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:33.374805 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:33.375291 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.375265 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v24kx"] Apr 17 18:49:33.375416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:33.375401 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:33.375598 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:33.375576 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:34.101538 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:34.101453 2580 generic.go:358] "Generic (PLEG): container finished" podID="ff6339d8-541a-4947-b00d-5f600d8d08c1" containerID="27d6bec7f5b6488927c295495ca544cd289c3b6bcb649cc6f8da8aa0afac841f" exitCode=0 Apr 17 18:49:34.101872 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:34.101540 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4hnl2" event={"ID":"ff6339d8-541a-4947-b00d-5f600d8d08c1","Type":"ContainerDied","Data":"27d6bec7f5b6488927c295495ca544cd289c3b6bcb649cc6f8da8aa0afac841f"} Apr 17 18:49:34.974295 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:34.974088 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:34.974463 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:34.974162 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:34.974463 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:34.974407 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:34.974586 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:34.974491 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:34.974586 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:34.974178 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:34.974754 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:34.974599 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:36.973140 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:36.973107 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:36.973585 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:36.973280 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jhmxj" podUID="df4bf5b8-bdc2-4ccc-b126-8107588c1304" Apr 17 18:49:36.973585 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:36.973107 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:36.973585 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:36.973465 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v24kx" podUID="70ceb0f8-7a3d-4e29-9470-f18b8af1daa1" Apr 17 18:49:36.973585 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:36.973107 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:36.973585 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:36.973559 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbtjg" podUID="699f0c3d-9fe9-46b1-8ca2-967fca78f239" Apr 17 18:49:38.187961 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.187854 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeReady" Apr 17 18:49:38.188492 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.188027 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 18:49:38.222656 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.222622 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-67d5f96857-4xrp8"] Apr 17 18:49:38.225511 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.225482 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.228139 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.228111 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 18:49:38.228410 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.228394 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 18:49:38.228605 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.228590 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2m8h2\"" Apr 17 18:49:38.228787 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.228773 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 18:49:38.234097 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.234067 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 18:49:38.234623 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.234598 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67d5f96857-4xrp8"] Apr 17 18:49:38.237551 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.237530 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-94ggh"] Apr 17 18:49:38.240387 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.240367 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-l4g6s"] Apr 17 18:49:38.240540 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.240520 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:49:38.242992 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.242971 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 18:49:38.243097 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.242971 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 18:49:38.243097 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.243011 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ppj8p\"" Apr 17 18:49:38.243416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.243400 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 18:49:38.243715 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.243701 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.245885 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.245865 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ctvd8\"" Apr 17 18:49:38.245995 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.245886 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 18:49:38.245995 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.245938 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 18:49:38.248917 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.248858 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-94ggh"] Apr 17 18:49:38.249292 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.249271 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l4g6s"] Apr 17 18:49:38.343720 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.343680 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llnpv\" (UniqueName: \"kubernetes.io/projected/98b160b8-551d-443c-a3a0-4d046919e27c-kube-api-access-llnpv\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:49:38.343884 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.343747 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14c44177-594e-4d5c-a3a1-0e8d4627aabd-ca-trust-extracted\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.343884 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.343778 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-certificates\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.343884 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.343807 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:49:38.343884 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.343831 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14c44177-594e-4d5c-a3a1-0e8d4627aabd-trusted-ca\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.343884 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.343859 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14c44177-594e-4d5c-a3a1-0e8d4627aabd-installation-pull-secrets\") pod 
\"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.343884 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.343882 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-bound-sa-token\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.344185 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.343947 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gschq\" (UniqueName: \"kubernetes.io/projected/648e7199-fd23-4496-ad24-5b9e829d77fa-kube-api-access-gschq\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.344185 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.343984 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgtvf\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-kube-api-access-lgtvf\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.344185 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.344037 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14c44177-594e-4d5c-a3a1-0e8d4627aabd-image-registry-private-configuration\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.344185 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.344059 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.344185 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.344075 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/648e7199-fd23-4496-ad24-5b9e829d77fa-config-volume\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.344185 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.344146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.344394 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.344187 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/648e7199-fd23-4496-ad24-5b9e829d77fa-tmp-dir\") pod \"dns-default-l4g6s\" (UID: 
\"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.445143 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445046 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14c44177-594e-4d5c-a3a1-0e8d4627aabd-image-registry-private-configuration\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.445143 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445106 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.445143 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445134 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/648e7199-fd23-4496-ad24-5b9e829d77fa-config-volume\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.445412 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.445221 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:49:38.445412 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.445245 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67d5f96857-4xrp8: secret "image-registry-tls" not found Apr 17 18:49:38.445412 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445275 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.445412 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.445314 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls podName:14c44177-594e-4d5c-a3a1-0e8d4627aabd nodeName:}" failed. No retries permitted until 2026-04-17 18:49:38.945298881 +0000 UTC m=+34.597834790 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls") pod "image-registry-67d5f96857-4xrp8" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd") : secret "image-registry-tls" not found Apr 17 18:49:38.445412 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445333 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/648e7199-fd23-4496-ad24-5b9e829d77fa-tmp-dir\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.445412 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445363 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llnpv\" (UniqueName: \"kubernetes.io/projected/98b160b8-551d-443c-a3a0-4d046919e27c-kube-api-access-llnpv\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:49:38.445412 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.445371 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:49:38.445412 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445387 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14c44177-594e-4d5c-a3a1-0e8d4627aabd-ca-trust-extracted\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.445412 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445411 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-certificates\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.445791 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.445425 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls podName:648e7199-fd23-4496-ad24-5b9e829d77fa nodeName:}" failed. No retries permitted until 2026-04-17 18:49:38.945407649 +0000 UTC m=+34.597943567 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls") pod "dns-default-l4g6s" (UID: "648e7199-fd23-4496-ad24-5b9e829d77fa") : secret "dns-default-metrics-tls" not found Apr 17 18:49:38.445791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445456 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:49:38.445791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445484 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14c44177-594e-4d5c-a3a1-0e8d4627aabd-trusted-ca\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.445791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445518 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14c44177-594e-4d5c-a3a1-0e8d4627aabd-installation-pull-secrets\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.445791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445546 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-bound-sa-token\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.445791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445587 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gschq\" (UniqueName: \"kubernetes.io/projected/648e7199-fd23-4496-ad24-5b9e829d77fa-kube-api-access-gschq\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.445791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445619 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgtvf\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-kube-api-access-lgtvf\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.445791 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445692 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/648e7199-fd23-4496-ad24-5b9e829d77fa-config-volume\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.446164 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445856 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/648e7199-fd23-4496-ad24-5b9e829d77fa-tmp-dir\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.446164 
ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.445974 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14c44177-594e-4d5c-a3a1-0e8d4627aabd-ca-trust-extracted\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.446247 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.446173 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:38.446247 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.446216 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert podName:98b160b8-551d-443c-a3a0-4d046919e27c nodeName:}" failed. No retries permitted until 2026-04-17 18:49:38.946202811 +0000 UTC m=+34.598738715 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert") pod "ingress-canary-94ggh" (UID: "98b160b8-551d-443c-a3a0-4d046919e27c") : secret "canary-serving-cert" not found Apr 17 18:49:38.446390 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.446370 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-certificates\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.446654 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.446628 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14c44177-594e-4d5c-a3a1-0e8d4627aabd-trusted-ca\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.449691 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.449648 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14c44177-594e-4d5c-a3a1-0e8d4627aabd-installation-pull-secrets\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.449691 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.449659 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14c44177-594e-4d5c-a3a1-0e8d4627aabd-image-registry-private-configuration\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.454209 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.454165 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-bound-sa-token\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.454838 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.454817 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-lgtvf\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-kube-api-access-lgtvf\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.455728 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.455705 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gschq\" (UniqueName: \"kubernetes.io/projected/648e7199-fd23-4496-ad24-5b9e829d77fa-kube-api-access-gschq\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.455823 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.455805 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llnpv\" (UniqueName: \"kubernetes.io/projected/98b160b8-551d-443c-a3a0-4d046919e27c-kube-api-access-llnpv\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:49:38.646529 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.646499 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:38.646697 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.646659 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:38.646747 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.646725 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret podName:df4bf5b8-bdc2-4ccc-b126-8107588c1304 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:10.646710002 +0000 UTC m=+66.299245911 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret") pod "global-pull-secret-syncer-jhmxj" (UID: "df4bf5b8-bdc2-4ccc-b126-8107588c1304") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:38.747649 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.747601 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k2fp\" (UniqueName: \"kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp\") pod \"network-check-target-qbtjg\" (UID: \"699f0c3d-9fe9-46b1-8ca2-967fca78f239\") " pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:38.747828 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.747729 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:38.747828 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.747808 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:38.747960 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.747834 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:38.747960 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.747848 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2k2fp for pod openshift-network-diagnostics/network-check-target-qbtjg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:38.747960 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.747866 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:38.747960 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.747923 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp podName:699f0c3d-9fe9-46b1-8ca2-967fca78f239 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:10.747889272 +0000 UTC m=+66.400425176 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-2k2fp" (UniqueName: "kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp") pod "network-check-target-qbtjg" (UID: "699f0c3d-9fe9-46b1-8ca2-967fca78f239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:38.747960 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.747949 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs podName:70ceb0f8-7a3d-4e29-9470-f18b8af1daa1 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:10.747930688 +0000 UTC m=+66.400466612 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs") pod "network-metrics-daemon-v24kx" (UID: "70ceb0f8-7a3d-4e29-9470-f18b8af1daa1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:38.949122 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.949084 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:38.949310 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.949139 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:49:38.949310 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.949207 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:49:38.949310 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.949229 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:38.949310 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.949276 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls podName:648e7199-fd23-4496-ad24-5b9e829d77fa nodeName:}" failed. No retries permitted until 2026-04-17 18:49:39.949260602 +0000 UTC m=+35.601796510 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls") pod "dns-default-l4g6s" (UID: "648e7199-fd23-4496-ad24-5b9e829d77fa") : secret "dns-default-metrics-tls" not found Apr 17 18:49:38.949310 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.949306 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:38.949535 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.949318 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:49:38.949535 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.949332 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67d5f96857-4xrp8: secret "image-registry-tls" not found Apr 17 18:49:38.949535 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.949367 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert podName:98b160b8-551d-443c-a3a0-4d046919e27c nodeName:}" failed. No retries permitted until 2026-04-17 18:49:39.949350031 +0000 UTC m=+35.601885946 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert") pod "ingress-canary-94ggh" (UID: "98b160b8-551d-443c-a3a0-4d046919e27c") : secret "canary-serving-cert" not found Apr 17 18:49:38.949535 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:38.949389 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls podName:14c44177-594e-4d5c-a3a1-0e8d4627aabd nodeName:}" failed. No retries permitted until 2026-04-17 18:49:39.949378923 +0000 UTC m=+35.601914839 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls") pod "image-registry-67d5f96857-4xrp8" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd") : secret "image-registry-tls" not found Apr 17 18:49:38.973162 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.973131 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:49:38.973923 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.973722 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:49:38.974056 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.973944 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:49:38.975717 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.975698 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 18:49:38.976117 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.976088 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 18:49:38.976737 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.976442 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 18:49:38.976737 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.976496 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 18:49:38.976737 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.976602 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tlcmp\"" Apr 17 18:49:38.976737 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:38.976662 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dwb49\"" Apr 17 18:49:39.957333 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:39.957294 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:49:39.957947 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:39.957393 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls\") pod \"image-registry-67d5f96857-4xrp8\" (UID: 
\"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:39.957947 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:39.957437 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:39.957947 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:39.957467 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:39.957947 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:39.957531 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert podName:98b160b8-551d-443c-a3a0-4d046919e27c nodeName:}" failed. No retries permitted until 2026-04-17 18:49:41.95751072 +0000 UTC m=+37.610046641 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert") pod "ingress-canary-94ggh" (UID: "98b160b8-551d-443c-a3a0-4d046919e27c") : secret "canary-serving-cert" not found Apr 17 18:49:39.957947 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:39.957531 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:49:39.957947 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:39.957568 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls podName:648e7199-fd23-4496-ad24-5b9e829d77fa nodeName:}" failed. No retries permitted until 2026-04-17 18:49:41.957558369 +0000 UTC m=+37.610094276 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls") pod "dns-default-l4g6s" (UID: "648e7199-fd23-4496-ad24-5b9e829d77fa") : secret "dns-default-metrics-tls" not found Apr 17 18:49:39.957947 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:39.957592 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:49:39.957947 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:39.957615 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67d5f96857-4xrp8: secret "image-registry-tls" not found Apr 17 18:49:39.957947 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:39.957677 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls podName:14c44177-594e-4d5c-a3a1-0e8d4627aabd nodeName:}" failed. No retries permitted until 2026-04-17 18:49:41.957658885 +0000 UTC m=+37.610194795 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls") pod "image-registry-67d5f96857-4xrp8" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd") : secret "image-registry-tls" not found Apr 17 18:49:41.975833 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:41.975800 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:49:41.976567 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:41.975875 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:41.976567 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:41.975927 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:41.976567 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:41.975984 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:41.976567 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:41.976010 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:49:41.976567 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:41.976016 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:49:41.976567 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:41.976031 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67d5f96857-4xrp8: secret "image-registry-tls" not found Apr 17 18:49:41.976567 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:41.976053 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls podName:648e7199-fd23-4496-ad24-5b9e829d77fa nodeName:}" failed. No retries permitted until 2026-04-17 18:49:45.976039514 +0000 UTC m=+41.628575426 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls") pod "dns-default-l4g6s" (UID: "648e7199-fd23-4496-ad24-5b9e829d77fa") : secret "dns-default-metrics-tls" not found Apr 17 18:49:41.976567 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:41.976066 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert podName:98b160b8-551d-443c-a3a0-4d046919e27c nodeName:}" failed. No retries permitted until 2026-04-17 18:49:45.976059365 +0000 UTC m=+41.628595269 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert") pod "ingress-canary-94ggh" (UID: "98b160b8-551d-443c-a3a0-4d046919e27c") : secret "canary-serving-cert" not found Apr 17 18:49:41.976567 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:41.976078 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls podName:14c44177-594e-4d5c-a3a1-0e8d4627aabd nodeName:}" failed. No retries permitted until 2026-04-17 18:49:45.9760734 +0000 UTC m=+41.628609304 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls") pod "image-registry-67d5f96857-4xrp8" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd") : secret "image-registry-tls" not found Apr 17 18:49:42.120274 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:42.120244 2580 generic.go:358] "Generic (PLEG): container finished" podID="ff6339d8-541a-4947-b00d-5f600d8d08c1" containerID="ab3702f9be54125e85e6cdec7045678cb14436c7fbb7e65f2532774f07ca1314" exitCode=0 Apr 17 18:49:42.120507 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:42.120300 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4hnl2" event={"ID":"ff6339d8-541a-4947-b00d-5f600d8d08c1","Type":"ContainerDied","Data":"ab3702f9be54125e85e6cdec7045678cb14436c7fbb7e65f2532774f07ca1314"} Apr 17 18:49:43.125040 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:43.125002 2580 generic.go:358] "Generic (PLEG): container finished" podID="ff6339d8-541a-4947-b00d-5f600d8d08c1" containerID="bfeb1f61ff07ea7977bfd1b30710fea9139c2c135cc1069a0d0f92408d4fdcea" exitCode=0 Apr 17 18:49:43.125472 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:43.125047 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4hnl2" event={"ID":"ff6339d8-541a-4947-b00d-5f600d8d08c1","Type":"ContainerDied","Data":"bfeb1f61ff07ea7977bfd1b30710fea9139c2c135cc1069a0d0f92408d4fdcea"} Apr 17 18:49:44.130226 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:44.130033 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4hnl2" event={"ID":"ff6339d8-541a-4947-b00d-5f600d8d08c1","Type":"ContainerStarted","Data":"cba86f004812e74a43af43318c5b9c36f4b0bccb25decb8b8b8405b8e0e32179"} Apr 17 18:49:44.153166 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:44.153115 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4hnl2" podStartSLOduration=5.872532674 podStartE2EDuration="39.15310344s" podCreationTimestamp="2026-04-17 18:49:05 +0000 UTC" firstStartedPulling="2026-04-17 18:49:07.764917463 +0000 UTC m=+3.417453380" lastFinishedPulling="2026-04-17 18:49:41.045488243 +0000 UTC m=+36.698024146" observedRunningTime="2026-04-17 18:49:44.152556966 +0000 UTC m=+39.805092891" watchObservedRunningTime="2026-04-17 18:49:44.15310344 +0000 UTC m=+39.805639366" Apr 17 18:49:46.005509 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:46.005472 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:49:46.005929 
ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:46.005553 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:46.005929 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:46.005582 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:46.005929 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:46.005622 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:46.005929 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:46.005672 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:49:46.005929 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:46.005684 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:49:46.005929 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:46.005699 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67d5f96857-4xrp8: secret "image-registry-tls" not found Apr 17 18:49:46.005929 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:46.005688 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert podName:98b160b8-551d-443c-a3a0-4d046919e27c nodeName:}" failed. No retries permitted until 2026-04-17 18:49:54.005671585 +0000 UTC m=+49.658207489 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert") pod "ingress-canary-94ggh" (UID: "98b160b8-551d-443c-a3a0-4d046919e27c") : secret "canary-serving-cert" not found Apr 17 18:49:46.005929 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:46.005767 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls podName:648e7199-fd23-4496-ad24-5b9e829d77fa nodeName:}" failed. No retries permitted until 2026-04-17 18:49:54.005748 +0000 UTC m=+49.658283906 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls") pod "dns-default-l4g6s" (UID: "648e7199-fd23-4496-ad24-5b9e829d77fa") : secret "dns-default-metrics-tls" not found Apr 17 18:49:46.005929 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:46.005786 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls podName:14c44177-594e-4d5c-a3a1-0e8d4627aabd nodeName:}" failed. No retries permitted until 2026-04-17 18:49:54.005776173 +0000 UTC m=+49.658312080 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls") pod "image-registry-67d5f96857-4xrp8" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd") : secret "image-registry-tls" not found Apr 17 18:49:54.064327 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:54.064291 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:49:54.064702 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:54.064338 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:49:54.064702 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:54.064440 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:49:54.064702 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:54.064459 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67d5f96857-4xrp8: secret "image-registry-tls" not found Apr 17 18:49:54.064702 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:54.064484 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:49:54.064702 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:54.064511 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls podName:14c44177-594e-4d5c-a3a1-0e8d4627aabd nodeName:}" failed. No retries permitted until 2026-04-17 18:50:10.064494667 +0000 UTC m=+65.717030575 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls") pod "image-registry-67d5f96857-4xrp8" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd") : secret "image-registry-tls" not found Apr 17 18:49:54.064702 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:49:54.064524 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:49:54.064702 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:54.064571 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls podName:648e7199-fd23-4496-ad24-5b9e829d77fa nodeName:}" failed. No retries permitted until 2026-04-17 18:50:10.064560566 +0000 UTC m=+65.717096470 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls") pod "dns-default-l4g6s" (UID: "648e7199-fd23-4496-ad24-5b9e829d77fa") : secret "dns-default-metrics-tls" not found Apr 17 18:49:54.064702 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:54.064620 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:54.064702 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:49:54.064667 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert podName:98b160b8-551d-443c-a3a0-4d046919e27c nodeName:}" failed. No retries permitted until 2026-04-17 18:50:10.064656921 +0000 UTC m=+65.717192824 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert") pod "ingress-canary-94ggh" (UID: "98b160b8-551d-443c-a3a0-4d046919e27c") : secret "canary-serving-cert" not found Apr 17 18:50:05.114221 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:05.114189 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7klhd" Apr 17 18:50:10.081634 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.081592 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:50:10.082122 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.081694 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:50:10.082122 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.081737 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:50:10.082122 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:10.081742 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:50:10.082122 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:10.081805 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert podName:98b160b8-551d-443c-a3a0-4d046919e27c nodeName:}" failed. No retries permitted until 2026-04-17 18:50:42.081786434 +0000 UTC m=+97.734322339 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert") pod "ingress-canary-94ggh" (UID: "98b160b8-551d-443c-a3a0-4d046919e27c") : secret "canary-serving-cert" not found Apr 17 18:50:10.082122 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:10.081840 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:50:10.082122 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:10.081858 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67d5f96857-4xrp8: secret "image-registry-tls" not found Apr 17 18:50:10.082122 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:10.081881 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:50:10.082122 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:10.081928 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls podName:14c44177-594e-4d5c-a3a1-0e8d4627aabd nodeName:}" failed. No retries permitted until 2026-04-17 18:50:42.081910408 +0000 UTC m=+97.734446328 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls") pod "image-registry-67d5f96857-4xrp8" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd") : secret "image-registry-tls" not found Apr 17 18:50:10.082122 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:10.081949 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls podName:648e7199-fd23-4496-ad24-5b9e829d77fa nodeName:}" failed. No retries permitted until 2026-04-17 18:50:42.081934829 +0000 UTC m=+97.734470734 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls") pod "dns-default-l4g6s" (UID: "648e7199-fd23-4496-ad24-5b9e829d77fa") : secret "dns-default-metrics-tls" not found Apr 17 18:50:10.686337 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.686302 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:50:10.689119 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.689099 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 18:50:10.699910 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.699874 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/df4bf5b8-bdc2-4ccc-b126-8107588c1304-original-pull-secret\") pod \"global-pull-secret-syncer-jhmxj\" (UID: \"df4bf5b8-bdc2-4ccc-b126-8107588c1304\") " pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:50:10.786251 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.786216 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jhmxj" Apr 17 18:50:10.786723 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.786698 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:50:10.786806 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.786774 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k2fp\" (UniqueName: \"kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp\") pod \"network-check-target-qbtjg\" (UID: \"699f0c3d-9fe9-46b1-8ca2-967fca78f239\") " pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:50:10.789214 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.789195 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 18:50:10.789283 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.789264 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 18:50:10.797371 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:10.797348 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 18:50:10.797472 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:10.797424 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs podName:70ceb0f8-7a3d-4e29-9470-f18b8af1daa1 nodeName:}" failed. No retries permitted until 2026-04-17 18:51:14.79740304 +0000 UTC m=+130.449938945 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs") pod "network-metrics-daemon-v24kx" (UID: "70ceb0f8-7a3d-4e29-9470-f18b8af1daa1") : secret "metrics-daemon-secret" not found Apr 17 18:50:10.799469 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.799445 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 18:50:10.810316 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.810292 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k2fp\" (UniqueName: \"kubernetes.io/projected/699f0c3d-9fe9-46b1-8ca2-967fca78f239-kube-api-access-2k2fp\") pod \"network-check-target-qbtjg\" (UID: \"699f0c3d-9fe9-46b1-8ca2-967fca78f239\") " pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:50:10.949925 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:10.949835 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jhmxj"] Apr 17 18:50:10.959740 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:50:10.959711 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf4bf5b8_bdc2_4ccc_b126_8107588c1304.slice/crio-e054b1c4f09afd56bac1850a91fc5870ec07c295a14fecb9f12581b03fc77488 WatchSource:0}: Error finding container e054b1c4f09afd56bac1850a91fc5870ec07c295a14fecb9f12581b03fc77488: Status 404 returned error can't find the container with id e054b1c4f09afd56bac1850a91fc5870ec07c295a14fecb9f12581b03fc77488 Apr 17 18:50:11.096046 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:11.096017 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tlcmp\"" Apr 17 18:50:11.104147 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:11.104120 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:50:11.183714 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:11.183676 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jhmxj" event={"ID":"df4bf5b8-bdc2-4ccc-b126-8107588c1304","Type":"ContainerStarted","Data":"e054b1c4f09afd56bac1850a91fc5870ec07c295a14fecb9f12581b03fc77488"} Apr 17 18:50:11.229180 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:11.229151 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qbtjg"] Apr 17 18:50:11.232569 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:50:11.232543 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod699f0c3d_9fe9_46b1_8ca2_967fca78f239.slice/crio-3dbbfde02a6e0b89102a9a2b4dd67d9133374c9d96a191f122f7d78f7b3fd310 WatchSource:0}: Error finding container 3dbbfde02a6e0b89102a9a2b4dd67d9133374c9d96a191f122f7d78f7b3fd310: Status 404 returned error can't find the container with id 3dbbfde02a6e0b89102a9a2b4dd67d9133374c9d96a191f122f7d78f7b3fd310 Apr 17 18:50:12.189325 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:12.189267 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qbtjg" event={"ID":"699f0c3d-9fe9-46b1-8ca2-967fca78f239","Type":"ContainerStarted","Data":"3dbbfde02a6e0b89102a9a2b4dd67d9133374c9d96a191f122f7d78f7b3fd310"} Apr 17 18:50:16.199148 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:16.199105 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jhmxj" event={"ID":"df4bf5b8-bdc2-4ccc-b126-8107588c1304","Type":"ContainerStarted","Data":"155b71fb6a7aef28aa423817996894d9a0f7e5781111b333ca210cad5da73b4c"} Apr 17 18:50:16.200454 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:16.200432 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qbtjg" event={"ID":"699f0c3d-9fe9-46b1-8ca2-967fca78f239","Type":"ContainerStarted","Data":"5ef6139e5ba7ea1d9eab75a2123322465da43231c6a2ece202eec1592c3f1926"} Apr 17 18:50:16.200574 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:16.200562 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:50:16.215310 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:16.215271 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jhmxj" podStartSLOduration=66.649778016 podStartE2EDuration="1m11.215259348s" podCreationTimestamp="2026-04-17 18:49:05 +0000 UTC" firstStartedPulling="2026-04-17 18:50:10.961345849 +0000 UTC m=+66.613881759" lastFinishedPulling="2026-04-17 18:50:15.526827186 +0000 UTC m=+71.179363091" observedRunningTime="2026-04-17 18:50:16.214127994 +0000 UTC m=+71.866663920" watchObservedRunningTime="2026-04-17 18:50:16.215259348 +0000 UTC m=+71.867795274" Apr 17 18:50:16.228289 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:16.228253 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qbtjg" podStartSLOduration=66.939324477 podStartE2EDuration="1m11.228240607s" podCreationTimestamp="2026-04-17 18:49:05 +0000 UTC" firstStartedPulling="2026-04-17 18:50:11.234464963 +0000 UTC m=+66.887000867" lastFinishedPulling="2026-04-17 
18:50:15.523381086 +0000 UTC m=+71.175916997" observedRunningTime="2026-04-17 18:50:16.227431887 +0000 UTC m=+71.879967813" watchObservedRunningTime="2026-04-17 18:50:16.228240607 +0000 UTC m=+71.880776523" Apr 17 18:50:37.206035 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.205999 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc"] Apr 17 18:50:37.210275 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.210259 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:37.212709 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.212682 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 18:50:37.212832 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.212684 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bcdl4\"" Apr 17 18:50:37.212832 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.212722 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 18:50:37.212832 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.212722 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 18:50:37.214939 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.214920 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 18:50:37.215979 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.215957 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc"] Apr 17 18:50:37.284991 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.284957 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:37.285150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.285024 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brd2f\" (UniqueName: \"kubernetes.io/projected/fb1d88d0-276e-45c5-8a26-0c045d19801b-kube-api-access-brd2f\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:37.285150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.285068 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fb1d88d0-276e-45c5-8a26-0c045d19801b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:37.307263 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.307235 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g9wkh"] Apr 17 18:50:37.310090 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.310076 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g9wkh" Apr 17 18:50:37.312682 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.312653 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 18:50:37.312939 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.312708 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-b67fl\"" Apr 17 18:50:37.313429 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.313404 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:50:37.313753 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.313732 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl"] Apr 17 18:50:37.316652 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.316633 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-whmp8"] Apr 17 18:50:37.316861 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.316840 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:37.319108 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.319086 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 18:50:37.319212 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.319165 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:50:37.319275 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.319171 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 18:50:37.319402 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.319382 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g9wkh"] Apr 17 18:50:37.319511 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.319488 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.319580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.319564 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jx96r\"" Apr 17 18:50:37.322074 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.322055 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-fh756\"" Apr 17 18:50:37.322170 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.322109 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:50:37.322376 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.322359 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 18:50:37.322437 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.322387 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 18:50:37.322489 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.322388 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 18:50:37.327365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.327347 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl"] Apr 17 18:50:37.333284 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.333248 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 18:50:37.341192 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.341171 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-whmp8"] Apr 17 18:50:37.386391 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.386358 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fb1d88d0-276e-45c5-8a26-0c045d19801b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:37.386391 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.386396 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbc6v\" (UniqueName: \"kubernetes.io/projected/ea018e6b-4b5a-464d-ba7a-d5dd8fb8f0b6-kube-api-access-kbc6v\") pod \"volume-data-source-validator-7c6cbb6c87-g9wkh\" (UID: \"ea018e6b-4b5a-464d-ba7a-d5dd8fb8f0b6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g9wkh" Apr 17 18:50:37.386587 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.386431 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:37.386587 ip-10-0-136-27 
kubenswrapper[2580]: I0417 18:50:37.386482 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brd2f\" (UniqueName: \"kubernetes.io/projected/fb1d88d0-276e-45c5-8a26-0c045d19801b-kube-api-access-brd2f\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:37.386651 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:37.386582 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 18:50:37.386683 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:37.386659 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls podName:fb1d88d0-276e-45c5-8a26-0c045d19801b nodeName:}" failed. No retries permitted until 2026-04-17 18:50:37.886639403 +0000 UTC m=+93.539175324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7sfxc" (UID: "fb1d88d0-276e-45c5-8a26-0c045d19801b") : secret "cluster-monitoring-operator-tls" not found Apr 17 18:50:37.387598 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.387580 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fb1d88d0-276e-45c5-8a26-0c045d19801b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:37.394404 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.394377 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brd2f\" (UniqueName: \"kubernetes.io/projected/fb1d88d0-276e-45c5-8a26-0c045d19801b-kube-api-access-brd2f\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:37.405414 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.405391 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl"] Apr 17 18:50:37.408504 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.408480 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" Apr 17 18:50:37.409742 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.409721 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-f4f48c65b-6jj2h"] Apr 17 18:50:37.410722 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.410697 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 18:50:37.410835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.410721 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:50:37.410835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.410729 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-wmlfq\"" Apr 17 18:50:37.410835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.410796 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 18:50:37.410835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.410831 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 18:50:37.412738 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.412720 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.414819 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.414801 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 18:50:37.414959 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.414829 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 18:50:37.414959 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.414832 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 18:50:37.415120 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.415104 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 18:50:37.415169 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.415125 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 18:50:37.415169 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.415155 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 18:50:37.415254 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.415213 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hrrp5\"" Apr 17 18:50:37.418477 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.418452 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl"] Apr 17 18:50:37.422012 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.421989 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-f4f48c65b-6jj2h"] Apr 17 18:50:37.487636 ip-10-0-136-27 kubenswrapper[2580]: I0417 
18:50:37.487602 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2p9f\" (UniqueName: \"kubernetes.io/projected/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-kube-api-access-h2p9f\") pod \"cluster-samples-operator-6dc5bdb6b4-f5nwl\" (UID: \"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:37.487823 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.487639 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e1d37f-b2bf-44fe-a623-a1c64ed6ba58-config\") pod \"service-ca-operator-d6fc45fc5-h9jnl\" (UID: \"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" Apr 17 18:50:37.487823 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.487677 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbc6v\" (UniqueName: \"kubernetes.io/projected/ea018e6b-4b5a-464d-ba7a-d5dd8fb8f0b6-kube-api-access-kbc6v\") pod \"volume-data-source-validator-7c6cbb6c87-g9wkh\" (UID: \"ea018e6b-4b5a-464d-ba7a-d5dd8fb8f0b6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g9wkh" Apr 17 18:50:37.487823 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.487706 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvb8n\" (UniqueName: \"kubernetes.io/projected/160e7e3f-dd7c-4341-8e04-ec0fc5728152-kube-api-access-zvb8n\") pod \"console-operator-9d4b6777b-whmp8\" (UID: \"160e7e3f-dd7c-4341-8e04-ec0fc5728152\") " pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.487823 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.487775 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/160e7e3f-dd7c-4341-8e04-ec0fc5728152-trusted-ca\") pod \"console-operator-9d4b6777b-whmp8\" (UID: \"160e7e3f-dd7c-4341-8e04-ec0fc5728152\") " pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.487823 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.487793 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5nwl\" (UID: \"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:37.488081 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.487860 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/160e7e3f-dd7c-4341-8e04-ec0fc5728152-serving-cert\") pod \"console-operator-9d4b6777b-whmp8\" (UID: \"160e7e3f-dd7c-4341-8e04-ec0fc5728152\") " pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.488081 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.487921 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40e1d37f-b2bf-44fe-a623-a1c64ed6ba58-serving-cert\") pod \"service-ca-operator-d6fc45fc5-h9jnl\" (UID: 
\"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" Apr 17 18:50:37.488081 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.487974 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nln2h\" (UniqueName: \"kubernetes.io/projected/40e1d37f-b2bf-44fe-a623-a1c64ed6ba58-kube-api-access-nln2h\") pod \"service-ca-operator-d6fc45fc5-h9jnl\" (UID: \"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" Apr 17 18:50:37.488081 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.488020 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160e7e3f-dd7c-4341-8e04-ec0fc5728152-config\") pod \"console-operator-9d4b6777b-whmp8\" (UID: \"160e7e3f-dd7c-4341-8e04-ec0fc5728152\") " pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.495111 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.495084 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbc6v\" (UniqueName: \"kubernetes.io/projected/ea018e6b-4b5a-464d-ba7a-d5dd8fb8f0b6-kube-api-access-kbc6v\") pod \"volume-data-source-validator-7c6cbb6c87-g9wkh\" (UID: \"ea018e6b-4b5a-464d-ba7a-d5dd8fb8f0b6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g9wkh" Apr 17 18:50:37.588369 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588330 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsmmv\" (UniqueName: \"kubernetes.io/projected/2050f398-269d-4da3-873f-4885dc5f98eb-kube-api-access-gsmmv\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.588533 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588384 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-default-certificate\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.588533 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588433 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/160e7e3f-dd7c-4341-8e04-ec0fc5728152-trusted-ca\") pod \"console-operator-9d4b6777b-whmp8\" (UID: \"160e7e3f-dd7c-4341-8e04-ec0fc5728152\") " pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.588533 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588470 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5nwl\" (UID: \"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:37.588533 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588502 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/160e7e3f-dd7c-4341-8e04-ec0fc5728152-serving-cert\") 
pod \"console-operator-9d4b6777b-whmp8\" (UID: \"160e7e3f-dd7c-4341-8e04-ec0fc5728152\") " pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.588533 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588518 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40e1d37f-b2bf-44fe-a623-a1c64ed6ba58-serving-cert\") pod \"service-ca-operator-d6fc45fc5-h9jnl\" (UID: \"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" Apr 17 18:50:37.588777 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588607 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nln2h\" (UniqueName: \"kubernetes.io/projected/40e1d37f-b2bf-44fe-a623-a1c64ed6ba58-kube-api-access-nln2h\") pod \"service-ca-operator-d6fc45fc5-h9jnl\" (UID: \"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" Apr 17 18:50:37.588777 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588654 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-stats-auth\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.588777 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:37.588615 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 18:50:37.588777 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588699 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160e7e3f-dd7c-4341-8e04-ec0fc5728152-config\") pod \"console-operator-9d4b6777b-whmp8\" (UID: \"160e7e3f-dd7c-4341-8e04-ec0fc5728152\") " pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.588777 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588736 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2p9f\" (UniqueName: \"kubernetes.io/projected/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-kube-api-access-h2p9f\") pod \"cluster-samples-operator-6dc5bdb6b4-f5nwl\" (UID: \"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:37.588777 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:37.588770 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls podName:b2f737c1-0835-4a54-9ad6-12e1bcb1e33e nodeName:}" failed. No retries permitted until 2026-04-17 18:50:38.088749757 +0000 UTC m=+93.741285668 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f5nwl" (UID: "b2f737c1-0835-4a54-9ad6-12e1bcb1e33e") : secret "samples-operator-tls" not found Apr 17 18:50:37.589110 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588796 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e1d37f-b2bf-44fe-a623-a1c64ed6ba58-config\") pod \"service-ca-operator-d6fc45fc5-h9jnl\" (UID: \"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" Apr 17 18:50:37.589110 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588844 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvb8n\" (UniqueName: \"kubernetes.io/projected/160e7e3f-dd7c-4341-8e04-ec0fc5728152-kube-api-access-zvb8n\") pod \"console-operator-9d4b6777b-whmp8\" (UID: \"160e7e3f-dd7c-4341-8e04-ec0fc5728152\") " pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.589110 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.588961 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.589110 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.589024 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.589408 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.589387 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/160e7e3f-dd7c-4341-8e04-ec0fc5728152-trusted-ca\") pod \"console-operator-9d4b6777b-whmp8\" (UID: \"160e7e3f-dd7c-4341-8e04-ec0fc5728152\") " pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.589569 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.589543 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e1d37f-b2bf-44fe-a623-a1c64ed6ba58-config\") pod \"service-ca-operator-d6fc45fc5-h9jnl\" (UID: \"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" Apr 17 18:50:37.589710 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.589690 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160e7e3f-dd7c-4341-8e04-ec0fc5728152-config\") pod \"console-operator-9d4b6777b-whmp8\" (UID: \"160e7e3f-dd7c-4341-8e04-ec0fc5728152\") " pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.591035 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.591018 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40e1d37f-b2bf-44fe-a623-a1c64ed6ba58-serving-cert\") pod 
\"service-ca-operator-d6fc45fc5-h9jnl\" (UID: \"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" Apr 17 18:50:37.591035 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.591029 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/160e7e3f-dd7c-4341-8e04-ec0fc5728152-serving-cert\") pod \"console-operator-9d4b6777b-whmp8\" (UID: \"160e7e3f-dd7c-4341-8e04-ec0fc5728152\") " pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.597395 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.597373 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nln2h\" (UniqueName: \"kubernetes.io/projected/40e1d37f-b2bf-44fe-a623-a1c64ed6ba58-kube-api-access-nln2h\") pod \"service-ca-operator-d6fc45fc5-h9jnl\" (UID: \"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" Apr 17 18:50:37.597647 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.597630 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2p9f\" (UniqueName: \"kubernetes.io/projected/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-kube-api-access-h2p9f\") pod \"cluster-samples-operator-6dc5bdb6b4-f5nwl\" (UID: \"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:37.597702 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.597631 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvb8n\" (UniqueName: \"kubernetes.io/projected/160e7e3f-dd7c-4341-8e04-ec0fc5728152-kube-api-access-zvb8n\") pod \"console-operator-9d4b6777b-whmp8\" (UID: \"160e7e3f-dd7c-4341-8e04-ec0fc5728152\") " pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.619543 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.619523 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g9wkh" Apr 17 18:50:37.636434 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.636413 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:37.691187 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.690123 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-default-certificate\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.691187 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.690227 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-stats-auth\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.691187 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.690305 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.691187 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.690345 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.691187 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.690379 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsmmv\" (UniqueName: \"kubernetes.io/projected/2050f398-269d-4da3-873f-4885dc5f98eb-kube-api-access-gsmmv\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.691187 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:37.691011 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle podName:2050f398-269d-4da3-873f-4885dc5f98eb nodeName:}" failed. No retries permitted until 2026-04-17 18:50:38.190985646 +0000 UTC m=+93.843521556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle") pod "router-default-f4f48c65b-6jj2h" (UID: "2050f398-269d-4da3-873f-4885dc5f98eb") : configmap references non-existent config key: service-ca.crt Apr 17 18:50:37.691187 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:37.691043 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 18:50:37.691187 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:37.691086 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs podName:2050f398-269d-4da3-873f-4885dc5f98eb nodeName:}" failed. No retries permitted until 2026-04-17 18:50:38.191074804 +0000 UTC m=+93.843610729 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs") pod "router-default-f4f48c65b-6jj2h" (UID: "2050f398-269d-4da3-873f-4885dc5f98eb") : secret "router-metrics-certs-default" not found Apr 17 18:50:37.693920 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.693873 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-default-certificate\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.694352 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.694332 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-stats-auth\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.701777 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.701742 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsmmv\" (UniqueName: \"kubernetes.io/projected/2050f398-269d-4da3-873f-4885dc5f98eb-kube-api-access-gsmmv\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:37.720565 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.720225 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" Apr 17 18:50:37.741953 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.741827 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g9wkh"] Apr 17 18:50:37.762911 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.762849 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-whmp8"] Apr 17 18:50:37.766507 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:50:37.766463 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod160e7e3f_dd7c_4341_8e04_ec0fc5728152.slice/crio-d0ce2a1f68d701475a682fda0d4c451984582e50a13d9cd64e199ef4e33d5852 WatchSource:0}: Error finding container d0ce2a1f68d701475a682fda0d4c451984582e50a13d9cd64e199ef4e33d5852: Status 404 returned error can't find the container with id d0ce2a1f68d701475a682fda0d4c451984582e50a13d9cd64e199ef4e33d5852 Apr 17 18:50:37.839798 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:37.839770 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl"] Apr 17 18:50:37.843304 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:50:37.843280 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40e1d37f_b2bf_44fe_a623_a1c64ed6ba58.slice/crio-8ce58c7f3df1f726044dd19f58dc2f76b0ba3a40471cc4cec2ffa9236f6d0f24 WatchSource:0}: Error finding container 8ce58c7f3df1f726044dd19f58dc2f76b0ba3a40471cc4cec2ffa9236f6d0f24: Status 404 returned error can't find the container with id 8ce58c7f3df1f726044dd19f58dc2f76b0ba3a40471cc4cec2ffa9236f6d0f24 Apr 17 18:50:37.891762 ip-10-0-136-27 
kubenswrapper[2580]: I0417 18:50:37.891725 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:37.891945 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:37.891869 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 18:50:37.892005 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:37.891955 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls podName:fb1d88d0-276e-45c5-8a26-0c045d19801b nodeName:}" failed. No retries permitted until 2026-04-17 18:50:38.891938882 +0000 UTC m=+94.544474786 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7sfxc" (UID: "fb1d88d0-276e-45c5-8a26-0c045d19801b") : secret "cluster-monitoring-operator-tls" not found Apr 17 18:50:38.094363 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:38.094260 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5nwl\" (UID: \"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:38.094531 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:38.094384 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 18:50:38.094531 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:38.094463 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls podName:b2f737c1-0835-4a54-9ad6-12e1bcb1e33e nodeName:}" failed. No retries permitted until 2026-04-17 18:50:39.094443434 +0000 UTC m=+94.746979338 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f5nwl" (UID: "b2f737c1-0835-4a54-9ad6-12e1bcb1e33e") : secret "samples-operator-tls" not found Apr 17 18:50:38.195077 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:38.195040 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:38.195249 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:38.195087 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:38.195249 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:38.195184 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 18:50:38.195249 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:38.195217 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle podName:2050f398-269d-4da3-873f-4885dc5f98eb nodeName:}" failed. No retries permitted until 2026-04-17 18:50:39.195199285 +0000 UTC m=+94.847735189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle") pod "router-default-f4f48c65b-6jj2h" (UID: "2050f398-269d-4da3-873f-4885dc5f98eb") : configmap references non-existent config key: service-ca.crt Apr 17 18:50:38.195249 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:38.195242 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs podName:2050f398-269d-4da3-873f-4885dc5f98eb nodeName:}" failed. No retries permitted until 2026-04-17 18:50:39.195235595 +0000 UTC m=+94.847771499 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs") pod "router-default-f4f48c65b-6jj2h" (UID: "2050f398-269d-4da3-873f-4885dc5f98eb") : secret "router-metrics-certs-default" not found Apr 17 18:50:38.243222 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:38.243187 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" event={"ID":"160e7e3f-dd7c-4341-8e04-ec0fc5728152","Type":"ContainerStarted","Data":"d0ce2a1f68d701475a682fda0d4c451984582e50a13d9cd64e199ef4e33d5852"} Apr 17 18:50:38.244412 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:38.244379 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g9wkh" event={"ID":"ea018e6b-4b5a-464d-ba7a-d5dd8fb8f0b6","Type":"ContainerStarted","Data":"cb6104c0500d9657ec0ded01cd210e2b4dfd673572e7dac35e5080c483256e97"} Apr 17 18:50:38.245439 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:38.245401 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" event={"ID":"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58","Type":"ContainerStarted","Data":"8ce58c7f3df1f726044dd19f58dc2f76b0ba3a40471cc4cec2ffa9236f6d0f24"} Apr 17 18:50:38.902340 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:38.902298 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:38.902535 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:38.902459 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 18:50:38.902599 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:38.902543 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls podName:fb1d88d0-276e-45c5-8a26-0c045d19801b nodeName:}" failed. No retries permitted until 2026-04-17 18:50:40.902521922 +0000 UTC m=+96.555057828 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7sfxc" (UID: "fb1d88d0-276e-45c5-8a26-0c045d19801b") : secret "cluster-monitoring-operator-tls" not found Apr 17 18:50:39.104800 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:39.104761 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5nwl\" (UID: \"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:39.104997 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:39.104944 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 18:50:39.105044 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:39.105027 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls podName:b2f737c1-0835-4a54-9ad6-12e1bcb1e33e nodeName:}" failed. No retries permitted until 2026-04-17 18:50:41.105010029 +0000 UTC m=+96.757545938 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f5nwl" (UID: "b2f737c1-0835-4a54-9ad6-12e1bcb1e33e") : secret "samples-operator-tls" not found Apr 17 18:50:39.205789 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:39.205745 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:39.205935 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:39.205802 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:39.206001 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:39.205933 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 18:50:39.206001 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:39.206000 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs podName:2050f398-269d-4da3-873f-4885dc5f98eb nodeName:}" failed. No retries permitted until 2026-04-17 18:50:41.205979979 +0000 UTC m=+96.858515898 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs") pod "router-default-f4f48c65b-6jj2h" (UID: "2050f398-269d-4da3-873f-4885dc5f98eb") : secret "router-metrics-certs-default" not found Apr 17 18:50:39.206092 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:39.206026 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle podName:2050f398-269d-4da3-873f-4885dc5f98eb nodeName:}" failed. No retries permitted until 2026-04-17 18:50:41.206006934 +0000 UTC m=+96.858542861 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle") pod "router-default-f4f48c65b-6jj2h" (UID: "2050f398-269d-4da3-873f-4885dc5f98eb") : configmap references non-existent config key: service-ca.crt Apr 17 18:50:40.251307 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:40.251264 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g9wkh" event={"ID":"ea018e6b-4b5a-464d-ba7a-d5dd8fb8f0b6","Type":"ContainerStarted","Data":"d4b140ebb2869480762b09b029fe724b26307db3324ea0d7fc062221161c05c9"} Apr 17 18:50:40.920578 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:40.920478 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:40.920708 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:40.920619 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 18:50:40.920708 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:40.920691 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls podName:fb1d88d0-276e-45c5-8a26-0c045d19801b nodeName:}" failed. No retries permitted until 2026-04-17 18:50:44.920674826 +0000 UTC m=+100.573210729 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7sfxc" (UID: "fb1d88d0-276e-45c5-8a26-0c045d19801b") : secret "cluster-monitoring-operator-tls" not found Apr 17 18:50:41.122724 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:41.122687 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5nwl\" (UID: \"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:41.122916 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:41.122834 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 18:50:41.122981 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:41.122917 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls podName:b2f737c1-0835-4a54-9ad6-12e1bcb1e33e nodeName:}" failed. No retries permitted until 2026-04-17 18:50:45.122882971 +0000 UTC m=+100.775418878 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f5nwl" (UID: "b2f737c1-0835-4a54-9ad6-12e1bcb1e33e") : secret "samples-operator-tls" not found Apr 17 18:50:41.223690 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:41.223602 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:41.223690 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:41.223652 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:41.223884 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:41.223731 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 18:50:41.223884 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:41.223765 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle podName:2050f398-269d-4da3-873f-4885dc5f98eb nodeName:}" failed. No retries permitted until 2026-04-17 18:50:45.223748722 +0000 UTC m=+100.876284626 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle") pod "router-default-f4f48c65b-6jj2h" (UID: "2050f398-269d-4da3-873f-4885dc5f98eb") : configmap references non-existent config key: service-ca.crt Apr 17 18:50:41.223884 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:41.223789 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs podName:2050f398-269d-4da3-873f-4885dc5f98eb nodeName:}" failed. No retries permitted until 2026-04-17 18:50:45.223780523 +0000 UTC m=+100.876316427 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs") pod "router-default-f4f48c65b-6jj2h" (UID: "2050f398-269d-4da3-873f-4885dc5f98eb") : secret "router-metrics-certs-default" not found Apr 17 18:50:41.256237 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:41.256199 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" event={"ID":"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58","Type":"ContainerStarted","Data":"e55912faa2293745ea8d5ba2cc5a94b711c2e323a4fd96a3daecbde7ba08739f"} Apr 17 18:50:41.260169 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:41.260148 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/0.log" Apr 17 18:50:41.260320 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:41.260192 2580 generic.go:358] "Generic (PLEG): container finished" podID="160e7e3f-dd7c-4341-8e04-ec0fc5728152" containerID="090b51d4c7fe4be6de82b582df75c4df20f57fa99ae2f9190b8c5dc5fbe5a24d" exitCode=255 Apr 17 18:50:41.260320 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:41.260233 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" event={"ID":"160e7e3f-dd7c-4341-8e04-ec0fc5728152","Type":"ContainerDied","Data":"090b51d4c7fe4be6de82b582df75c4df20f57fa99ae2f9190b8c5dc5fbe5a24d"} Apr 17 18:50:41.260487 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:41.260474 2580 scope.go:117] "RemoveContainer" containerID="090b51d4c7fe4be6de82b582df75c4df20f57fa99ae2f9190b8c5dc5fbe5a24d" Apr 17 18:50:41.270765 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:41.270730 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" podStartSLOduration=1.523190449 podStartE2EDuration="4.270716563s" podCreationTimestamp="2026-04-17 18:50:37 +0000 UTC" firstStartedPulling="2026-04-17 18:50:37.84512644 +0000 UTC m=+93.497662343" lastFinishedPulling="2026-04-17 18:50:40.592652553 +0000 UTC m=+96.245188457" observedRunningTime="2026-04-17 18:50:41.270611951 +0000 UTC m=+96.923147879" watchObservedRunningTime="2026-04-17 18:50:41.270716563 +0000 UTC m=+96.923252489" Apr 17 18:50:41.271391 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:41.271357 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g9wkh" podStartSLOduration=2.816386257 podStartE2EDuration="4.27134911s" podCreationTimestamp="2026-04-17 18:50:37 +0000 UTC" firstStartedPulling="2026-04-17 18:50:37.748723368 +0000 UTC m=+93.401259271" 
lastFinishedPulling="2026-04-17 18:50:39.203686205 +0000 UTC m=+94.856222124" observedRunningTime="2026-04-17 18:50:40.26483273 +0000 UTC m=+95.917368655" watchObservedRunningTime="2026-04-17 18:50:41.27134911 +0000 UTC m=+96.923885035" Apr 17 18:50:42.130113 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:42.130077 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls\") pod \"image-registry-67d5f96857-4xrp8\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:50:42.130259 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:42.130122 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:50:42.130259 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:42.130150 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:50:42.130259 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:42.130221 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:50:42.130259 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:42.130240 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67d5f96857-4xrp8: secret "image-registry-tls" not found Apr 17 18:50:42.130259 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:42.130241 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:50:42.130427 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:42.130259 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:50:42.130427 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:42.130300 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls podName:14c44177-594e-4d5c-a3a1-0e8d4627aabd nodeName:}" failed. No retries permitted until 2026-04-17 18:51:46.13028465 +0000 UTC m=+161.782820555 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls") pod "image-registry-67d5f96857-4xrp8" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd") : secret "image-registry-tls" not found Apr 17 18:50:42.130427 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:42.130315 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert podName:98b160b8-551d-443c-a3a0-4d046919e27c nodeName:}" failed. No retries permitted until 2026-04-17 18:51:46.130307786 +0000 UTC m=+161.782843690 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert") pod "ingress-canary-94ggh" (UID: "98b160b8-551d-443c-a3a0-4d046919e27c") : secret "canary-serving-cert" not found Apr 17 18:50:42.130427 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:42.130324 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls podName:648e7199-fd23-4496-ad24-5b9e829d77fa nodeName:}" failed. No retries permitted until 2026-04-17 18:51:46.130319281 +0000 UTC m=+161.782855199 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls") pod "dns-default-l4g6s" (UID: "648e7199-fd23-4496-ad24-5b9e829d77fa") : secret "dns-default-metrics-tls" not found Apr 17 18:50:42.267365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:42.267337 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/1.log" Apr 17 18:50:42.267811 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:42.267738 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/0.log" Apr 17 18:50:42.267811 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:42.267778 2580 generic.go:358] "Generic (PLEG): container finished" podID="160e7e3f-dd7c-4341-8e04-ec0fc5728152" containerID="1bb25c75df9b714c75b97dbd79ff186c5a36e0147c6d85f7ab525e0a8377f146" exitCode=255 Apr 17 18:50:42.267952 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:42.267807 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" event={"ID":"160e7e3f-dd7c-4341-8e04-ec0fc5728152","Type":"ContainerDied","Data":"1bb25c75df9b714c75b97dbd79ff186c5a36e0147c6d85f7ab525e0a8377f146"} Apr 17 18:50:42.267952 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:42.267844 2580 scope.go:117] "RemoveContainer" containerID="090b51d4c7fe4be6de82b582df75c4df20f57fa99ae2f9190b8c5dc5fbe5a24d" Apr 17 18:50:42.268124 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:42.268108 2580 scope.go:117] "RemoveContainer" containerID="1bb25c75df9b714c75b97dbd79ff186c5a36e0147c6d85f7ab525e0a8377f146" Apr 17 18:50:42.268336 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:42.268313 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-whmp8_openshift-console-operator(160e7e3f-dd7c-4341-8e04-ec0fc5728152)\"" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" podUID="160e7e3f-dd7c-4341-8e04-ec0fc5728152" Apr 17 18:50:43.271944 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:43.271918 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/1.log" Apr 17 18:50:43.272352 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:43.272273 2580 scope.go:117] "RemoveContainer" containerID="1bb25c75df9b714c75b97dbd79ff186c5a36e0147c6d85f7ab525e0a8377f146" Apr 17 18:50:43.272449 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:43.272431 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-whmp8_openshift-console-operator(160e7e3f-dd7c-4341-8e04-ec0fc5728152)\"" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" podUID="160e7e3f-dd7c-4341-8e04-ec0fc5728152" Apr 17 18:50:43.381915 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:43.381869 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-f4v84"] Apr 17 18:50:43.385944 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:43.385928 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f4v84" Apr 17 18:50:43.388295 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:43.388273 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-nxzgl\"" Apr 17 18:50:43.392237 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:43.392175 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-f4v84"] Apr 17 18:50:43.406382 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:43.406358 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zsszv_87361592-a029-4a93-9af8-ed4f1a1cc87c/dns-node-resolver/0.log" Apr 17 18:50:43.543968 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:43.543837 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7s99\" (UniqueName: \"kubernetes.io/projected/2e87ce7c-13b6-4964-bc26-6ccd9b000527-kube-api-access-k7s99\") pod \"network-check-source-8894fc9bd-f4v84\" (UID: \"2e87ce7c-13b6-4964-bc26-6ccd9b000527\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f4v84" Apr 17 18:50:43.645171 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:43.645131 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7s99\" (UniqueName: \"kubernetes.io/projected/2e87ce7c-13b6-4964-bc26-6ccd9b000527-kube-api-access-k7s99\") pod \"network-check-source-8894fc9bd-f4v84\" (UID: \"2e87ce7c-13b6-4964-bc26-6ccd9b000527\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f4v84" Apr 17 18:50:43.652606 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:43.652580 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7s99\" (UniqueName: \"kubernetes.io/projected/2e87ce7c-13b6-4964-bc26-6ccd9b000527-kube-api-access-k7s99\") pod \"network-check-source-8894fc9bd-f4v84\" (UID: \"2e87ce7c-13b6-4964-bc26-6ccd9b000527\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f4v84" Apr 17 18:50:43.695484 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:43.695449 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f4v84" Apr 17 18:50:43.810429 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:43.810348 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-f4v84"] Apr 17 18:50:43.813651 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:50:43.813623 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e87ce7c_13b6_4964_bc26_6ccd9b000527.slice/crio-c107bdd415db10e2d3235369fb949f669176a1768f343c00f260ac48c7db9f70 WatchSource:0}: Error finding container c107bdd415db10e2d3235369fb949f669176a1768f343c00f260ac48c7db9f70: Status 404 returned error can't find the container with id c107bdd415db10e2d3235369fb949f669176a1768f343c00f260ac48c7db9f70 Apr 17 18:50:44.275569 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:44.275539 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f4v84" event={"ID":"2e87ce7c-13b6-4964-bc26-6ccd9b000527","Type":"ContainerStarted","Data":"393dc7e4a81ae10897282d423734af01d7d06d5ba81e86030d59c2b8bfdd7c15"} Apr 17 18:50:44.275569 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:44.275573 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f4v84" event={"ID":"2e87ce7c-13b6-4964-bc26-6ccd9b000527","Type":"ContainerStarted","Data":"c107bdd415db10e2d3235369fb949f669176a1768f343c00f260ac48c7db9f70"} Apr 17 18:50:44.292140 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:44.292091 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f4v84" podStartSLOduration=1.292076788 podStartE2EDuration="1.292076788s" podCreationTimestamp="2026-04-17 18:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:50:44.291306822 +0000 UTC m=+99.943842748" watchObservedRunningTime="2026-04-17 18:50:44.292076788 +0000 UTC m=+99.944612714" Apr 17 18:50:44.406688 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:44.406658 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vnmqw_39fdb78c-608d-4cb2-8a53-feb04ee1cdcf/node-ca/0.log" Apr 17 18:50:44.957990 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:44.957953 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:44.958143 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:44.958095 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 18:50:44.958199 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:44.958175 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls podName:fb1d88d0-276e-45c5-8a26-0c045d19801b nodeName:}" failed. No retries permitted until 2026-04-17 18:50:52.958160437 +0000 UTC m=+108.610696341 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7sfxc" (UID: "fb1d88d0-276e-45c5-8a26-0c045d19801b") : secret "cluster-monitoring-operator-tls" not found Apr 17 18:50:45.159958 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:45.159916 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5nwl\" (UID: \"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:45.160137 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:45.160040 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 18:50:45.160137 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:45.160108 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls podName:b2f737c1-0835-4a54-9ad6-12e1bcb1e33e nodeName:}" failed. No retries permitted until 2026-04-17 18:50:53.160092493 +0000 UTC m=+108.812628398 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f5nwl" (UID: "b2f737c1-0835-4a54-9ad6-12e1bcb1e33e") : secret "samples-operator-tls" not found Apr 17 18:50:45.260751 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:45.260717 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:45.260935 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:45.260761 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:45.260935 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:45.260918 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 18:50:45.261025 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:45.260933 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle podName:2050f398-269d-4da3-873f-4885dc5f98eb nodeName:}" failed. No retries permitted until 2026-04-17 18:50:53.260890896 +0000 UTC m=+108.913447165 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle") pod "router-default-f4f48c65b-6jj2h" (UID: "2050f398-269d-4da3-873f-4885dc5f98eb") : configmap references non-existent config key: service-ca.crt Apr 17 18:50:45.261025 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:45.260972 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs podName:2050f398-269d-4da3-873f-4885dc5f98eb nodeName:}" failed. No retries permitted until 2026-04-17 18:50:53.2609577 +0000 UTC m=+108.913493604 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs") pod "router-default-f4f48c65b-6jj2h" (UID: "2050f398-269d-4da3-873f-4885dc5f98eb") : secret "router-metrics-certs-default" not found Apr 17 18:50:47.204569 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:47.204539 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qbtjg" Apr 17 18:50:47.637399 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:47.637354 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:47.637399 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:47.637401 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:50:47.637744 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:47.637732 2580 scope.go:117] "RemoveContainer" containerID="1bb25c75df9b714c75b97dbd79ff186c5a36e0147c6d85f7ab525e0a8377f146" Apr 17 18:50:47.637938 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:47.637918 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-whmp8_openshift-console-operator(160e7e3f-dd7c-4341-8e04-ec0fc5728152)\"" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" podUID="160e7e3f-dd7c-4341-8e04-ec0fc5728152" Apr 17 18:50:53.019494 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:53.019460 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:50:53.020011 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:53.019626 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 18:50:53.020011 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:53.019717 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls podName:fb1d88d0-276e-45c5-8a26-0c045d19801b nodeName:}" failed. No retries permitted until 2026-04-17 18:51:09.019696699 +0000 UTC m=+124.672232606 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7sfxc" (UID: "fb1d88d0-276e-45c5-8a26-0c045d19801b") : secret "cluster-monitoring-operator-tls" not found Apr 17 18:50:53.220944 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:53.220883 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5nwl\" (UID: \"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:53.223535 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:53.223504 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2f737c1-0835-4a54-9ad6-12e1bcb1e33e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f5nwl\" (UID: \"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:53.230260 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:53.230243 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" Apr 17 18:50:53.321416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:53.321386 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:53.321593 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:53.321435 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:50:53.321593 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:53.321523 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 18:50:53.321593 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:53.321561 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle podName:2050f398-269d-4da3-873f-4885dc5f98eb nodeName:}" failed. No retries permitted until 2026-04-17 18:51:09.321542626 +0000 UTC m=+124.974078552 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle") pod "router-default-f4f48c65b-6jj2h" (UID: "2050f398-269d-4da3-873f-4885dc5f98eb") : configmap references non-existent config key: service-ca.crt Apr 17 18:50:53.321768 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:50:53.321613 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs podName:2050f398-269d-4da3-873f-4885dc5f98eb nodeName:}" failed. 
No retries permitted until 2026-04-17 18:51:09.321597626 +0000 UTC m=+124.974133529 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs") pod "router-default-f4f48c65b-6jj2h" (UID: "2050f398-269d-4da3-873f-4885dc5f98eb") : secret "router-metrics-certs-default" not found Apr 17 18:50:53.350238 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:53.350209 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl"] Apr 17 18:50:54.305367 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:54.305330 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" event={"ID":"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e","Type":"ContainerStarted","Data":"b0a4ee33e1d2a22d243469791348617a16d6a59d168d1cf8cfc6e838e447d10c"} Apr 17 18:50:56.311472 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:56.311434 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" event={"ID":"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e","Type":"ContainerStarted","Data":"610952cfc66df39034654e2763e74ba7c8cb3415efb7b14f8d6b8b42abb12920"} Apr 17 18:50:56.311472 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:56.311471 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" event={"ID":"b2f737c1-0835-4a54-9ad6-12e1bcb1e33e","Type":"ContainerStarted","Data":"a5731cad2ad63f343da50739243a06f29673d04807ef8fe6b008c1c5c1ec57e3"} Apr 17 18:50:56.327119 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:50:56.327070 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f5nwl" podStartSLOduration=17.406044729 podStartE2EDuration="19.327055764s" podCreationTimestamp="2026-04-17 18:50:37 +0000 UTC" firstStartedPulling="2026-04-17 18:50:53.392303301 +0000 UTC m=+109.044839218" lastFinishedPulling="2026-04-17 18:50:55.313314349 +0000 UTC m=+110.965850253" observedRunningTime="2026-04-17 18:50:56.326068983 +0000 UTC m=+111.978604909" watchObservedRunningTime="2026-04-17 18:50:56.327055764 +0000 UTC m=+111.979591689" Apr 17 18:51:00.973682 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:00.973647 2580 scope.go:117] "RemoveContainer" containerID="1bb25c75df9b714c75b97dbd79ff186c5a36e0147c6d85f7ab525e0a8377f146" Apr 17 18:51:01.325246 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:01.325216 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 18:51:01.325652 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:01.325637 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/1.log" Apr 17 18:51:01.325716 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:01.325669 2580 generic.go:358] "Generic (PLEG): container finished" podID="160e7e3f-dd7c-4341-8e04-ec0fc5728152" containerID="6bf7aaf64b8fe477c6e73373b87023429815ec918f26d405f5501e227b618777" exitCode=255 Apr 17 18:51:01.325766 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:01.325726 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" event={"ID":"160e7e3f-dd7c-4341-8e04-ec0fc5728152","Type":"ContainerDied","Data":"6bf7aaf64b8fe477c6e73373b87023429815ec918f26d405f5501e227b618777"} Apr 17 18:51:01.325766 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:01.325755 2580 scope.go:117] "RemoveContainer" containerID="1bb25c75df9b714c75b97dbd79ff186c5a36e0147c6d85f7ab525e0a8377f146" Apr 17 18:51:01.326135 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:01.326116 2580 scope.go:117] "RemoveContainer" containerID="6bf7aaf64b8fe477c6e73373b87023429815ec918f26d405f5501e227b618777" Apr 17 18:51:01.326354 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:51:01.326330 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-whmp8_openshift-console-operator(160e7e3f-dd7c-4341-8e04-ec0fc5728152)\"" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" podUID="160e7e3f-dd7c-4341-8e04-ec0fc5728152" Apr 17 18:51:02.329853 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:02.329824 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 18:51:07.637236 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:07.637203 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:51:07.637236 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:07.637245 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:51:07.637729 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:07.637567 2580 scope.go:117] "RemoveContainer" containerID="6bf7aaf64b8fe477c6e73373b87023429815ec918f26d405f5501e227b618777" Apr 17 18:51:07.637765 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:51:07.637733 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-whmp8_openshift-console-operator(160e7e3f-dd7c-4341-8e04-ec0fc5728152)\"" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" podUID="160e7e3f-dd7c-4341-8e04-ec0fc5728152" Apr 17 18:51:08.006100 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.006067 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qns4x"] Apr 17 18:51:08.010298 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.010275 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qns4x" Apr 17 18:51:08.012618 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.012593 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 18:51:08.012774 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.012621 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 18:51:08.013437 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.013423 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-nhczf\"" Apr 17 18:51:08.016995 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.016951 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qns4x"] Apr 17 18:51:08.121998 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.121971 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-mgsw9"] Apr 17 18:51:08.125054 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.125037 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.128020 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.127991 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 18:51:08.128135 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.128043 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 18:51:08.128359 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.128342 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 18:51:08.128550 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.128535 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 18:51:08.128653 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.128639 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-njpjx\"" Apr 17 18:51:08.137754 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.137732 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mgsw9"] Apr 17 18:51:08.146646 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.146622 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6620fc0c-c3f6-4c91-a578-386ce8be85f5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qns4x\" (UID: \"6620fc0c-c3f6-4c91-a578-386ce8be85f5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qns4x" Apr 17 18:51:08.146737 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.146658 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6620fc0c-c3f6-4c91-a578-386ce8be85f5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qns4x\" (UID: \"6620fc0c-c3f6-4c91-a578-386ce8be85f5\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-qns4x" Apr 17 18:51:08.247828 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.247797 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7e24123b-7334-43c8-abd1-998265d69576-crio-socket\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.248016 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.247854 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s7r6\" (UniqueName: \"kubernetes.io/projected/7e24123b-7334-43c8-abd1-998265d69576-kube-api-access-5s7r6\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.248016 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.247949 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7e24123b-7334-43c8-abd1-998265d69576-data-volume\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.248016 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.247969 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7e24123b-7334-43c8-abd1-998265d69576-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.248137 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.248048 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7e24123b-7334-43c8-abd1-998265d69576-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.248137 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.248118 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6620fc0c-c3f6-4c91-a578-386ce8be85f5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qns4x\" (UID: \"6620fc0c-c3f6-4c91-a578-386ce8be85f5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qns4x" Apr 17 18:51:08.248224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.248147 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6620fc0c-c3f6-4c91-a578-386ce8be85f5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qns4x\" (UID: \"6620fc0c-c3f6-4c91-a578-386ce8be85f5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qns4x" Apr 17 18:51:08.248734 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.248713 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6620fc0c-c3f6-4c91-a578-386ce8be85f5-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-qns4x\" (UID: \"6620fc0c-c3f6-4c91-a578-386ce8be85f5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qns4x" Apr 17 18:51:08.250580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.250555 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6620fc0c-c3f6-4c91-a578-386ce8be85f5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qns4x\" (UID: \"6620fc0c-c3f6-4c91-a578-386ce8be85f5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qns4x" Apr 17 18:51:08.320390 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.320317 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qns4x" Apr 17 18:51:08.349096 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.349067 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7e24123b-7334-43c8-abd1-998265d69576-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.349252 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.349170 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7e24123b-7334-43c8-abd1-998265d69576-crio-socket\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.349252 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.349232 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s7r6\" (UniqueName: \"kubernetes.io/projected/7e24123b-7334-43c8-abd1-998265d69576-kube-api-access-5s7r6\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.349361 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.349269 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7e24123b-7334-43c8-abd1-998265d69576-data-volume\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.349361 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.349293 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7e24123b-7334-43c8-abd1-998265d69576-crio-socket\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.349361 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.349295 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7e24123b-7334-43c8-abd1-998265d69576-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.349765 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.349737 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7e24123b-7334-43c8-abd1-998265d69576-data-volume\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.349861 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.349787 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7e24123b-7334-43c8-abd1-998265d69576-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.351482 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.351463 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7e24123b-7334-43c8-abd1-998265d69576-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.357323 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.357278 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s7r6\" (UniqueName: \"kubernetes.io/projected/7e24123b-7334-43c8-abd1-998265d69576-kube-api-access-5s7r6\") pod \"insights-runtime-extractor-mgsw9\" (UID: \"7e24123b-7334-43c8-abd1-998265d69576\") " pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.434376 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.434343 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mgsw9" Apr 17 18:51:08.439914 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.439863 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qns4x"] Apr 17 18:51:08.442864 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:08.442835 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6620fc0c_c3f6_4c91_a578_386ce8be85f5.slice/crio-3b547c4777250b803cef34c897618c3f194640dfb65b05460ad71bf119364e16 WatchSource:0}: Error finding container 3b547c4777250b803cef34c897618c3f194640dfb65b05460ad71bf119364e16: Status 404 returned error can't find the container with id 3b547c4777250b803cef34c897618c3f194640dfb65b05460ad71bf119364e16 Apr 17 18:51:08.550093 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:08.550064 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mgsw9"] Apr 17 18:51:08.553565 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:08.553540 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e24123b_7334_43c8_abd1_998265d69576.slice/crio-4ff0b955a57069b65b3767c37aeb0a7d853fa87ce3ce03a1c5bb130a666f274c WatchSource:0}: Error finding container 4ff0b955a57069b65b3767c37aeb0a7d853fa87ce3ce03a1c5bb130a666f274c: Status 404 returned error can't find the container with id 4ff0b955a57069b65b3767c37aeb0a7d853fa87ce3ce03a1c5bb130a666f274c Apr 17 18:51:09.055327 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.055291 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:51:09.057816 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.057794 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb1d88d0-276e-45c5-8a26-0c045d19801b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7sfxc\" (UID: \"fb1d88d0-276e-45c5-8a26-0c045d19801b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:51:09.321424 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.321338 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bcdl4\"" Apr 17 18:51:09.330280 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.330250 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" Apr 17 18:51:09.347155 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.347117 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mgsw9" event={"ID":"7e24123b-7334-43c8-abd1-998265d69576","Type":"ContainerStarted","Data":"53e4ba54c59301c316c8c911ecdf435f0cecb83e226947f6b9cd73909459407b"} Apr 17 18:51:09.347273 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.347167 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mgsw9" event={"ID":"7e24123b-7334-43c8-abd1-998265d69576","Type":"ContainerStarted","Data":"4ff0b955a57069b65b3767c37aeb0a7d853fa87ce3ce03a1c5bb130a666f274c"} Apr 17 18:51:09.348093 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.348064 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qns4x" event={"ID":"6620fc0c-c3f6-4c91-a578-386ce8be85f5","Type":"ContainerStarted","Data":"3b547c4777250b803cef34c897618c3f194640dfb65b05460ad71bf119364e16"} Apr 17 18:51:09.357436 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.357407 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:51:09.357552 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.357512 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:51:09.358042 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.358022 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2050f398-269d-4da3-873f-4885dc5f98eb-service-ca-bundle\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:51:09.359805 ip-10-0-136-27 
kubenswrapper[2580]: I0417 18:51:09.359781 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2050f398-269d-4da3-873f-4885dc5f98eb-metrics-certs\") pod \"router-default-f4f48c65b-6jj2h\" (UID: \"2050f398-269d-4da3-873f-4885dc5f98eb\") " pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:51:09.527001 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.526968 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hrrp5\"" Apr 17 18:51:09.535705 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.535672 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:51:09.762870 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.762838 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-f4f48c65b-6jj2h"] Apr 17 18:51:09.767015 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:09.766987 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2050f398_269d_4da3_873f_4885dc5f98eb.slice/crio-4a324644878a23408daa21784c162f87de5c03761badd3c84d980656519388c2 WatchSource:0}: Error finding container 4a324644878a23408daa21784c162f87de5c03761badd3c84d980656519388c2: Status 404 returned error can't find the container with id 4a324644878a23408daa21784c162f87de5c03761badd3c84d980656519388c2 Apr 17 18:51:09.775764 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:09.775744 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc"] Apr 17 18:51:09.778913 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:09.778873 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb1d88d0_276e_45c5_8a26_0c045d19801b.slice/crio-27858016c556f384056e7c6fa49765fa771bb136d35ee7ac95e1419f82cc9b18 WatchSource:0}: Error finding container 27858016c556f384056e7c6fa49765fa771bb136d35ee7ac95e1419f82cc9b18: Status 404 returned error can't find the container with id 27858016c556f384056e7c6fa49765fa771bb136d35ee7ac95e1419f82cc9b18 Apr 17 18:51:10.352298 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:10.352253 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qns4x" event={"ID":"6620fc0c-c3f6-4c91-a578-386ce8be85f5","Type":"ContainerStarted","Data":"03ad253a417473eb893dc1c5ed9e907fe120201f4a53f8ecb8096f978ddeadcf"} Apr 17 18:51:10.354144 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:10.354116 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mgsw9" event={"ID":"7e24123b-7334-43c8-abd1-998265d69576","Type":"ContainerStarted","Data":"9e0e4b4621e3f8199f1c2caf3a2472e20513f0ea6d2ccfe6fac6567a53678e0b"} Apr 17 18:51:10.355929 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:10.355881 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-f4f48c65b-6jj2h" event={"ID":"2050f398-269d-4da3-873f-4885dc5f98eb","Type":"ContainerStarted","Data":"06cda9f05f08cae1d066008d3dd5da839ebf8457b435f6b48edb0536a5156899"} Apr 17 18:51:10.356017 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:10.355937 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-f4f48c65b-6jj2h" 
event={"ID":"2050f398-269d-4da3-873f-4885dc5f98eb","Type":"ContainerStarted","Data":"4a324644878a23408daa21784c162f87de5c03761badd3c84d980656519388c2"} Apr 17 18:51:10.356931 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:10.356880 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" event={"ID":"fb1d88d0-276e-45c5-8a26-0c045d19801b","Type":"ContainerStarted","Data":"27858016c556f384056e7c6fa49765fa771bb136d35ee7ac95e1419f82cc9b18"} Apr 17 18:51:10.366076 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:10.366034 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qns4x" podStartSLOduration=2.196862145 podStartE2EDuration="3.366020373s" podCreationTimestamp="2026-04-17 18:51:07 +0000 UTC" firstStartedPulling="2026-04-17 18:51:08.444680196 +0000 UTC m=+124.097216107" lastFinishedPulling="2026-04-17 18:51:09.613838426 +0000 UTC m=+125.266374335" observedRunningTime="2026-04-17 18:51:10.36557684 +0000 UTC m=+126.018112767" watchObservedRunningTime="2026-04-17 18:51:10.366020373 +0000 UTC m=+126.018556345" Apr 17 18:51:10.382074 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:10.381821 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-f4f48c65b-6jj2h" podStartSLOduration=33.381802809 podStartE2EDuration="33.381802809s" podCreationTimestamp="2026-04-17 18:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:51:10.381220189 +0000 UTC m=+126.033756117" watchObservedRunningTime="2026-04-17 18:51:10.381802809 +0000 UTC m=+126.034338738" Apr 17 18:51:10.536196 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:10.536154 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:51:10.539252 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:10.539224 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:51:11.360437 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:11.360073 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:51:11.361424 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:11.361396 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-f4f48c65b-6jj2h" Apr 17 18:51:12.367053 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.366931 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mgsw9" event={"ID":"7e24123b-7334-43c8-abd1-998265d69576","Type":"ContainerStarted","Data":"c8fba33a8852d19c459005361b6bf7f3dee786d407cbba69a28e4d359c266349"} Apr 17 18:51:12.368264 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.368240 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" event={"ID":"fb1d88d0-276e-45c5-8a26-0c045d19801b","Type":"ContainerStarted","Data":"8f9c17076f0dc45d28315fee39011d9e5664e4dce9544b807fcec45620b5b55c"} Apr 17 18:51:12.387884 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.387833 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-mgsw9" 
podStartSLOduration=0.973668162 podStartE2EDuration="4.387815505s" podCreationTimestamp="2026-04-17 18:51:08 +0000 UTC" firstStartedPulling="2026-04-17 18:51:08.616877245 +0000 UTC m=+124.269413151" lastFinishedPulling="2026-04-17 18:51:12.031024588 +0000 UTC m=+127.683560494" observedRunningTime="2026-04-17 18:51:12.386485595 +0000 UTC m=+128.039021523" watchObservedRunningTime="2026-04-17 18:51:12.387815505 +0000 UTC m=+128.040351432" Apr 17 18:51:12.406946 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.406874 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7sfxc" podStartSLOduration=33.158612159 podStartE2EDuration="35.406858699s" podCreationTimestamp="2026-04-17 18:50:37 +0000 UTC" firstStartedPulling="2026-04-17 18:51:09.780733166 +0000 UTC m=+125.433269084" lastFinishedPulling="2026-04-17 18:51:12.028979717 +0000 UTC m=+127.681515624" observedRunningTime="2026-04-17 18:51:12.405364588 +0000 UTC m=+128.057900517" watchObservedRunningTime="2026-04-17 18:51:12.406858699 +0000 UTC m=+128.059394624" Apr 17 18:51:12.530981 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.530945 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm"] Apr 17 18:51:12.534449 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.534425 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm" Apr 17 18:51:12.537342 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.537323 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 18:51:12.537555 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.537538 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-bpm5v\"" Apr 17 18:51:12.545667 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.545641 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm"] Apr 17 18:51:12.584577 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.584549 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/406d0f3c-3b14-407a-a8a2-2894a5d84b73-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-rccbm\" (UID: \"406d0f3c-3b14-407a-a8a2-2894a5d84b73\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm" Apr 17 18:51:12.685818 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.685744 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/406d0f3c-3b14-407a-a8a2-2894a5d84b73-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-rccbm\" (UID: \"406d0f3c-3b14-407a-a8a2-2894a5d84b73\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm" Apr 17 18:51:12.688241 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.688203 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/406d0f3c-3b14-407a-a8a2-2894a5d84b73-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-rccbm\" 
(UID: \"406d0f3c-3b14-407a-a8a2-2894a5d84b73\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm" Apr 17 18:51:12.842676 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.842628 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm" Apr 17 18:51:12.958150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:12.958078 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm"] Apr 17 18:51:12.961997 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:12.961970 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406d0f3c_3b14_407a_a8a2_2894a5d84b73.slice/crio-ed4e86af0b263959d9fddef9d88de9a4f44833311c112cf2a6ae5556943b5a0b WatchSource:0}: Error finding container ed4e86af0b263959d9fddef9d88de9a4f44833311c112cf2a6ae5556943b5a0b: Status 404 returned error can't find the container with id ed4e86af0b263959d9fddef9d88de9a4f44833311c112cf2a6ae5556943b5a0b Apr 17 18:51:13.371411 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:13.371369 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm" event={"ID":"406d0f3c-3b14-407a-a8a2-2894a5d84b73","Type":"ContainerStarted","Data":"ed4e86af0b263959d9fddef9d88de9a4f44833311c112cf2a6ae5556943b5a0b"} Apr 17 18:51:14.805873 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:14.805819 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:51:14.808359 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:14.808334 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70ceb0f8-7a3d-4e29-9470-f18b8af1daa1-metrics-certs\") pod \"network-metrics-daemon-v24kx\" (UID: \"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1\") " pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:51:15.001558 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:15.001527 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dwb49\"" Apr 17 18:51:15.010387 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:15.010364 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v24kx" Apr 17 18:51:15.123915 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:15.123873 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v24kx"] Apr 17 18:51:15.129509 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:15.129476 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70ceb0f8_7a3d_4e29_9470_f18b8af1daa1.slice/crio-252121747b1c0d8a32202d3cd8c4fca3764d4c803761ca7ae75cf1f424b5c103 WatchSource:0}: Error finding container 252121747b1c0d8a32202d3cd8c4fca3764d4c803761ca7ae75cf1f424b5c103: Status 404 returned error can't find the container with id 252121747b1c0d8a32202d3cd8c4fca3764d4c803761ca7ae75cf1f424b5c103 Apr 17 18:51:15.376773 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:15.376665 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v24kx" event={"ID":"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1","Type":"ContainerStarted","Data":"252121747b1c0d8a32202d3cd8c4fca3764d4c803761ca7ae75cf1f424b5c103"} Apr 17 18:51:15.377964 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:15.377923 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm" event={"ID":"406d0f3c-3b14-407a-a8a2-2894a5d84b73","Type":"ContainerStarted","Data":"055b29c4c6d81e7f80a32d1d635ae8a32d0d27268136cb4f160699fde28359b4"} Apr 17 18:51:15.378163 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:15.378140 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm" Apr 17 18:51:15.382946 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:15.382924 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm" Apr 17 18:51:15.392524 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:15.392282 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rccbm" podStartSLOduration=1.742476678 podStartE2EDuration="3.392267818s" podCreationTimestamp="2026-04-17 18:51:12 +0000 UTC" firstStartedPulling="2026-04-17 18:51:12.963868111 +0000 UTC m=+128.616404014" lastFinishedPulling="2026-04-17 18:51:14.613659246 +0000 UTC m=+130.266195154" observedRunningTime="2026-04-17 18:51:15.391636966 +0000 UTC m=+131.044172894" watchObservedRunningTime="2026-04-17 18:51:15.392267818 +0000 UTC m=+131.044803744" Apr 17 18:51:17.385154 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:17.385114 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v24kx" event={"ID":"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1","Type":"ContainerStarted","Data":"f2bdc0aa23915719926206fc0f3578f8b62a80fcde0fd4f91ccbafa3cf0ba92f"} Apr 17 18:51:17.385154 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:17.385159 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v24kx" event={"ID":"70ceb0f8-7a3d-4e29-9470-f18b8af1daa1","Type":"ContainerStarted","Data":"21463c65cfe201cfb47e002618cc6dc95dc35c050fd61868f7e0b53142e68733"} Apr 17 18:51:17.399565 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:17.399509 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-v24kx" podStartSLOduration=131.235905854 podStartE2EDuration="2m12.399494519s" podCreationTimestamp="2026-04-17 18:49:05 +0000 UTC" firstStartedPulling="2026-04-17 18:51:15.131220146 +0000 UTC m=+130.783756055" lastFinishedPulling="2026-04-17 18:51:16.294808814 +0000 UTC m=+131.947344720" observedRunningTime="2026-04-17 18:51:17.399062382 +0000 UTC m=+133.051598307" watchObservedRunningTime="2026-04-17 18:51:17.399494519 +0000 UTC m=+133.052030445" Apr 17 18:51:18.974387 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:18.974351 2580 scope.go:117] "RemoveContainer" containerID="6bf7aaf64b8fe477c6e73373b87023429815ec918f26d405f5501e227b618777" Apr 17 18:51:18.974738 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:51:18.974594 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-whmp8_openshift-console-operator(160e7e3f-dd7c-4341-8e04-ec0fc5728152)\"" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" podUID="160e7e3f-dd7c-4341-8e04-ec0fc5728152" Apr 17 18:51:19.929489 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.929453 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp"] Apr 17 18:51:19.932943 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.932922 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:19.935328 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.935301 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 18:51:19.936067 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.936046 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-xdz8s\"" Apr 17 18:51:19.936342 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.936310 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 18:51:19.936342 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.936329 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 18:51:19.937547 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.937530 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-wr6gx"] Apr 17 18:51:19.941207 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.941176 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:19.944483 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.944461 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 18:51:19.944636 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.944527 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 18:51:19.944636 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.944601 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 18:51:19.944743 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.944680 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-n4m6c\"" Apr 17 18:51:19.955992 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.955966 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp"] Apr 17 18:51:19.957625 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.957588 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-wr6gx"] Apr 17 18:51:19.967145 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.967125 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nmk4n"] Apr 17 18:51:19.970352 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.970337 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:19.972526 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.972498 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 18:51:19.972643 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.972598 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7jr9c\"" Apr 17 18:51:19.972746 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.972726 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 18:51:19.972826 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:19.972802 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 18:51:20.052314 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052282 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.052314 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052320 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b98b43f3-b8ff-4a36-b218-2addb5512968-sys\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.052703 
ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052336 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stv9p\" (UniqueName: \"kubernetes.io/projected/b98b43f3-b8ff-4a36-b218-2addb5512968-kube-api-access-stv9p\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052361 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2f32f32-33e9-48d6-8d38-5b98d899f12b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052380 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2f32f32-33e9-48d6-8d38-5b98d899f12b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052413 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b98b43f3-b8ff-4a36-b218-2addb5512968-metrics-client-ca\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052461 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45de126b-8927-4aa9-b058-aaf39e7bf849-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: \"45de126b-8927-4aa9-b058-aaf39e7bf849\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052487 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a2f32f32-33e9-48d6-8d38-5b98d899f12b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052503 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2f32f32-33e9-48d6-8d38-5b98d899f12b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052536 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7w5l\" (UniqueName: \"kubernetes.io/projected/45de126b-8927-4aa9-b058-aaf39e7bf849-kube-api-access-p7w5l\") 
pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: \"45de126b-8927-4aa9-b058-aaf39e7bf849\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052578 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-tls\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052593 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-wtmp\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052611 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45de126b-8927-4aa9-b058-aaf39e7bf849-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: \"45de126b-8927-4aa9-b058-aaf39e7bf849\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052625 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-textfile\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052641 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-accelerators-collector-config\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.052703 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052698 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a2f32f32-33e9-48d6-8d38-5b98d899f12b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.053144 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052727 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqdvs\" (UniqueName: \"kubernetes.io/projected/a2f32f32-33e9-48d6-8d38-5b98d899f12b-kube-api-access-zqdvs\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.053144 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052746 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"root\" (UniqueName: \"kubernetes.io/host-path/b98b43f3-b8ff-4a36-b218-2addb5512968-root\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.053144 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.052767 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45de126b-8927-4aa9-b058-aaf39e7bf849-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: \"45de126b-8927-4aa9-b058-aaf39e7bf849\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.153656 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.153615 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-tls\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.153656 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.153652 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-wtmp\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.153852 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.153672 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45de126b-8927-4aa9-b058-aaf39e7bf849-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: \"45de126b-8927-4aa9-b058-aaf39e7bf849\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.153852 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.153690 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-textfile\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.153852 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.153709 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-accelerators-collector-config\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.153852 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.153742 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a2f32f32-33e9-48d6-8d38-5b98d899f12b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.153852 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.153761 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqdvs\" (UniqueName: 
\"kubernetes.io/projected/a2f32f32-33e9-48d6-8d38-5b98d899f12b-kube-api-access-zqdvs\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.153852 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.153776 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b98b43f3-b8ff-4a36-b218-2addb5512968-root\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.153852 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:51:20.153792 2580 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 18:51:20.153852 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.153801 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45de126b-8927-4aa9-b058-aaf39e7bf849-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: \"45de126b-8927-4aa9-b058-aaf39e7bf849\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.153852 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.153826 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-wtmp\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.153831 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:51:20.153882 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-tls podName:b98b43f3-b8ff-4a36-b218-2addb5512968 nodeName:}" failed. No retries permitted until 2026-04-17 18:51:20.65385927 +0000 UTC m=+136.306395178 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-tls") pod "node-exporter-nmk4n" (UID: "b98b43f3-b8ff-4a36-b218-2addb5512968") : secret "node-exporter-tls" not found Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:51:20.153959 2580 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:51:20.154030 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45de126b-8927-4aa9-b058-aaf39e7bf849-openshift-state-metrics-tls podName:45de126b-8927-4aa9-b058-aaf39e7bf849 nodeName:}" failed. No retries permitted until 2026-04-17 18:51:20.65401475 +0000 UTC m=+136.306550654 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/45de126b-8927-4aa9-b058-aaf39e7bf849-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-wr6gp" (UID: "45de126b-8927-4aa9-b058-aaf39e7bf849") : secret "openshift-state-metrics-tls" not found Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154085 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b98b43f3-b8ff-4a36-b218-2addb5512968-sys\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154087 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-textfile\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154097 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b98b43f3-b8ff-4a36-b218-2addb5512968-root\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154110 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stv9p\" (UniqueName: \"kubernetes.io/projected/b98b43f3-b8ff-4a36-b218-2addb5512968-kube-api-access-stv9p\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154140 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b98b43f3-b8ff-4a36-b218-2addb5512968-sys\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154142 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2f32f32-33e9-48d6-8d38-5b98d899f12b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154178 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2f32f32-33e9-48d6-8d38-5b98d899f12b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154216 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b98b43f3-b8ff-4a36-b218-2addb5512968-metrics-client-ca\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " 
pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154259 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45de126b-8927-4aa9-b058-aaf39e7bf849-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: \"45de126b-8927-4aa9-b058-aaf39e7bf849\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154294 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a2f32f32-33e9-48d6-8d38-5b98d899f12b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.154365 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154317 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2f32f32-33e9-48d6-8d38-5b98d899f12b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.155090 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154482 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-accelerators-collector-config\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.155090 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154745 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a2f32f32-33e9-48d6-8d38-5b98d899f12b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.155090 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.154991 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b98b43f3-b8ff-4a36-b218-2addb5512968-metrics-client-ca\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.155090 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.155057 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7w5l\" (UniqueName: \"kubernetes.io/projected/45de126b-8927-4aa9-b058-aaf39e7bf849-kube-api-access-p7w5l\") pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: \"45de126b-8927-4aa9-b058-aaf39e7bf849\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.155318 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.155293 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45de126b-8927-4aa9-b058-aaf39e7bf849-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: \"45de126b-8927-4aa9-b058-aaf39e7bf849\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.155393 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.155309 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a2f32f32-33e9-48d6-8d38-5b98d899f12b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.155944 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.155919 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2f32f32-33e9-48d6-8d38-5b98d899f12b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.156497 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.156477 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.156693 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.156675 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2f32f32-33e9-48d6-8d38-5b98d899f12b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.156852 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.156834 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2f32f32-33e9-48d6-8d38-5b98d899f12b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.157382 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.157355 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45de126b-8927-4aa9-b058-aaf39e7bf849-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: \"45de126b-8927-4aa9-b058-aaf39e7bf849\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.166600 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.166528 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stv9p\" (UniqueName: \"kubernetes.io/projected/b98b43f3-b8ff-4a36-b218-2addb5512968-kube-api-access-stv9p\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.166712 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.166682 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqdvs\" (UniqueName: \"kubernetes.io/projected/a2f32f32-33e9-48d6-8d38-5b98d899f12b-kube-api-access-zqdvs\") pod 
\"kube-state-metrics-69db897b98-wr6gx\" (UID: \"a2f32f32-33e9-48d6-8d38-5b98d899f12b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.167428 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.167127 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7w5l\" (UniqueName: \"kubernetes.io/projected/45de126b-8927-4aa9-b058-aaf39e7bf849-kube-api-access-p7w5l\") pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: \"45de126b-8927-4aa9-b058-aaf39e7bf849\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.254004 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.253971 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" Apr 17 18:51:20.376139 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.376101 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-wr6gx"] Apr 17 18:51:20.379191 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:20.379162 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2f32f32_33e9_48d6_8d38_5b98d899f12b.slice/crio-6bc4630e4b76f61c5066854299708ab5b1924976a2cc09c56dc6212e3c7cd94d WatchSource:0}: Error finding container 6bc4630e4b76f61c5066854299708ab5b1924976a2cc09c56dc6212e3c7cd94d: Status 404 returned error can't find the container with id 6bc4630e4b76f61c5066854299708ab5b1924976a2cc09c56dc6212e3c7cd94d Apr 17 18:51:20.393907 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.393869 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" event={"ID":"a2f32f32-33e9-48d6-8d38-5b98d899f12b","Type":"ContainerStarted","Data":"6bc4630e4b76f61c5066854299708ab5b1924976a2cc09c56dc6212e3c7cd94d"} Apr 17 18:51:20.660025 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.659918 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-tls\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.660025 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.660017 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45de126b-8927-4aa9-b058-aaf39e7bf849-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: \"45de126b-8927-4aa9-b058-aaf39e7bf849\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.662386 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.662363 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b98b43f3-b8ff-4a36-b218-2addb5512968-node-exporter-tls\") pod \"node-exporter-nmk4n\" (UID: \"b98b43f3-b8ff-4a36-b218-2addb5512968\") " pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.662500 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.662453 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45de126b-8927-4aa9-b058-aaf39e7bf849-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-wr6gp\" (UID: 
\"45de126b-8927-4aa9-b058-aaf39e7bf849\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.848781 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.848748 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" Apr 17 18:51:20.879393 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.879353 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nmk4n" Apr 17 18:51:20.890508 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:20.890479 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb98b43f3_b8ff_4a36_b218_2addb5512968.slice/crio-27f92be20d3b61b8bf7f81dd20c599f4baab4d0ad0063edc172ffcca55fcdf36 WatchSource:0}: Error finding container 27f92be20d3b61b8bf7f81dd20c599f4baab4d0ad0063edc172ffcca55fcdf36: Status 404 returned error can't find the container with id 27f92be20d3b61b8bf7f81dd20c599f4baab4d0ad0063edc172ffcca55fcdf36 Apr 17 18:51:20.970628 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:20.970598 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp"] Apr 17 18:51:20.974919 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:20.974863 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45de126b_8927_4aa9_b058_aaf39e7bf849.slice/crio-7040420e0275865e51ef33b7f73ef24f930a09e5459148e4bc42d638435ad254 WatchSource:0}: Error finding container 7040420e0275865e51ef33b7f73ef24f930a09e5459148e4bc42d638435ad254: Status 404 returned error can't find the container with id 7040420e0275865e51ef33b7f73ef24f930a09e5459148e4bc42d638435ad254 Apr 17 18:51:21.400770 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:21.400722 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" event={"ID":"45de126b-8927-4aa9-b058-aaf39e7bf849","Type":"ContainerStarted","Data":"ac728fdfef857fd5408206f53a418db093f2d2fc7085ef629b77b0526d5b3cfa"} Apr 17 18:51:21.400770 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:21.400769 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" event={"ID":"45de126b-8927-4aa9-b058-aaf39e7bf849","Type":"ContainerStarted","Data":"adb0549dbc8dbad37b544afccd0a6313c69372edf9a0be3d6c466ab88a89eef5"} Apr 17 18:51:21.401321 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:21.400783 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" event={"ID":"45de126b-8927-4aa9-b058-aaf39e7bf849","Type":"ContainerStarted","Data":"7040420e0275865e51ef33b7f73ef24f930a09e5459148e4bc42d638435ad254"} Apr 17 18:51:21.402547 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:21.402480 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nmk4n" event={"ID":"b98b43f3-b8ff-4a36-b218-2addb5512968","Type":"ContainerStarted","Data":"27f92be20d3b61b8bf7f81dd20c599f4baab4d0ad0063edc172ffcca55fcdf36"} Apr 17 18:51:22.407595 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.407460 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" 
event={"ID":"a2f32f32-33e9-48d6-8d38-5b98d899f12b","Type":"ContainerStarted","Data":"69c32fdaa38431e47fdfbc7a869df738400364c732685fcb8277a0624ab2215d"} Apr 17 18:51:22.407595 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.407520 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" event={"ID":"a2f32f32-33e9-48d6-8d38-5b98d899f12b","Type":"ContainerStarted","Data":"9459f2bc72af4e7b850747cf4a1eec6bbe6de1b146216c5324d40c91c59402cb"} Apr 17 18:51:22.408867 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.408826 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nmk4n" event={"ID":"b98b43f3-b8ff-4a36-b218-2addb5512968","Type":"ContainerStarted","Data":"1933aaf0d104c8b5034f5d8b66884ea8412469d593954934521040db9eaed52e"} Apr 17 18:51:22.925244 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.925211 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-65678c6864-lv862"] Apr 17 18:51:22.929509 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.929485 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:22.931828 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.931757 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 18:51:22.931828 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.931771 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 18:51:22.931828 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.931757 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 18:51:22.932124 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.931851 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-7295eej6nubl6\"" Apr 17 18:51:22.932124 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.931851 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 18:51:22.932239 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.932181 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 18:51:22.932239 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.932183 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-rvhvr\"" Apr 17 18:51:22.939341 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:22.939320 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-65678c6864-lv862"] Apr 17 18:51:23.083074 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.083041 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-tls\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.083280 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.083090 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.083280 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.083187 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-grpc-tls\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.083280 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.083220 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.083280 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.083248 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.083446 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.083298 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a62685de-f03c-4d27-8601-ef4f4810a833-metrics-client-ca\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.083446 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.083360 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.083446 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.083382 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzzlb\" (UniqueName: \"kubernetes.io/projected/a62685de-f03c-4d27-8601-ef4f4810a833-kube-api-access-tzzlb\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.184561 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.184468 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.184561 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.184523 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzzlb\" (UniqueName: \"kubernetes.io/projected/a62685de-f03c-4d27-8601-ef4f4810a833-kube-api-access-tzzlb\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.184822 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.184594 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-tls\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.184822 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.184623 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.184822 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.184677 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-grpc-tls\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.185005 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.184838 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.185005 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.184920 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.185005 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.184971 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a62685de-f03c-4d27-8601-ef4f4810a833-metrics-client-ca\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.185867 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.185772 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a62685de-f03c-4d27-8601-ef4f4810a833-metrics-client-ca\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.187866 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.187806 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-tls\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.187866 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.187837 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.188097 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.188066 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.188159 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.188065 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.188224 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.188193 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.188279 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.188264 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a62685de-f03c-4d27-8601-ef4f4810a833-secret-grpc-tls\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.191711 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.191685 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzzlb\" (UniqueName: \"kubernetes.io/projected/a62685de-f03c-4d27-8601-ef4f4810a833-kube-api-access-tzzlb\") pod \"thanos-querier-65678c6864-lv862\" (UID: \"a62685de-f03c-4d27-8601-ef4f4810a833\") " pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.239758 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.239725 2580 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:23.369287 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.369255 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-65678c6864-lv862"] Apr 17 18:51:23.374130 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:23.374100 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda62685de_f03c_4d27_8601_ef4f4810a833.slice/crio-613dbd6987c2f1da112e926cf1545693be9ad37f67a998cdc46655a666e8aaa0 WatchSource:0}: Error finding container 613dbd6987c2f1da112e926cf1545693be9ad37f67a998cdc46655a666e8aaa0: Status 404 returned error can't find the container with id 613dbd6987c2f1da112e926cf1545693be9ad37f67a998cdc46655a666e8aaa0 Apr 17 18:51:23.413551 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.413510 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" event={"ID":"a2f32f32-33e9-48d6-8d38-5b98d899f12b","Type":"ContainerStarted","Data":"0d797607c504cefeab9562665e902475353c8702cecfc597daf92c236f747b3f"} Apr 17 18:51:23.414610 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.414578 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65678c6864-lv862" event={"ID":"a62685de-f03c-4d27-8601-ef4f4810a833","Type":"ContainerStarted","Data":"613dbd6987c2f1da112e926cf1545693be9ad37f67a998cdc46655a666e8aaa0"} Apr 17 18:51:23.415813 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.415788 2580 generic.go:358] "Generic (PLEG): container finished" podID="b98b43f3-b8ff-4a36-b218-2addb5512968" containerID="1933aaf0d104c8b5034f5d8b66884ea8412469d593954934521040db9eaed52e" exitCode=0 Apr 17 18:51:23.415950 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.415874 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nmk4n" event={"ID":"b98b43f3-b8ff-4a36-b218-2addb5512968","Type":"ContainerDied","Data":"1933aaf0d104c8b5034f5d8b66884ea8412469d593954934521040db9eaed52e"} Apr 17 18:51:23.417769 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.417747 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" event={"ID":"45de126b-8927-4aa9-b058-aaf39e7bf849","Type":"ContainerStarted","Data":"be65a5c10e1761099266bb0a0572fe30e6f68048311bffa59f13fcf04f37466e"} Apr 17 18:51:23.431462 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.431416 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-wr6gx" podStartSLOduration=2.679848623 podStartE2EDuration="4.431403254s" podCreationTimestamp="2026-04-17 18:51:19 +0000 UTC" firstStartedPulling="2026-04-17 18:51:20.381157061 +0000 UTC m=+136.033692965" lastFinishedPulling="2026-04-17 18:51:22.13271169 +0000 UTC m=+137.785247596" observedRunningTime="2026-04-17 18:51:23.429866203 +0000 UTC m=+139.082402129" watchObservedRunningTime="2026-04-17 18:51:23.431403254 +0000 UTC m=+139.083939179" Apr 17 18:51:23.446003 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:23.445954 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wr6gp" podStartSLOduration=2.997435274 podStartE2EDuration="4.445940705s" podCreationTimestamp="2026-04-17 18:51:19 +0000 UTC" firstStartedPulling="2026-04-17 
18:51:21.11211671 +0000 UTC m=+136.764652614" lastFinishedPulling="2026-04-17 18:51:22.560622139 +0000 UTC m=+138.213158045" observedRunningTime="2026-04-17 18:51:23.444760107 +0000 UTC m=+139.097296033" watchObservedRunningTime="2026-04-17 18:51:23.445940705 +0000 UTC m=+139.098476629" Apr 17 18:51:24.423553 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:24.423502 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nmk4n" event={"ID":"b98b43f3-b8ff-4a36-b218-2addb5512968","Type":"ContainerStarted","Data":"d6b6db4fdf05ad87622136dccf8303593fa6daf29f6020b18cc7f2301224f040"} Apr 17 18:51:24.423553 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:24.423552 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nmk4n" event={"ID":"b98b43f3-b8ff-4a36-b218-2addb5512968","Type":"ContainerStarted","Data":"10499ac4ad854d8d1dd413700eb9a123ed9ea0d91187d1f191edba5866ba7a98"} Apr 17 18:51:24.445591 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:24.444387 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nmk4n" podStartSLOduration=4.206356515 podStartE2EDuration="5.444367976s" podCreationTimestamp="2026-04-17 18:51:19 +0000 UTC" firstStartedPulling="2026-04-17 18:51:20.892641585 +0000 UTC m=+136.545177489" lastFinishedPulling="2026-04-17 18:51:22.130653033 +0000 UTC m=+137.783188950" observedRunningTime="2026-04-17 18:51:24.441020833 +0000 UTC m=+140.093556771" watchObservedRunningTime="2026-04-17 18:51:24.444367976 +0000 UTC m=+140.096903903" Apr 17 18:51:26.185646 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.185616 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 18:51:26.189540 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.189520 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.192628 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.192605 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 18:51:26.193782 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.193561 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5p2rppk6l278u\"" Apr 17 18:51:26.193782 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.193571 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 18:51:26.193782 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.193597 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qmls2\"" Apr 17 18:51:26.193782 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.193613 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 18:51:26.193782 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.193542 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 18:51:26.193782 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.193570 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 18:51:26.193782 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.193619 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 18:51:26.193782 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.193671 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 18:51:26.193782 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.193622 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 18:51:26.194264 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.194089 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 18:51:26.194264 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.194163 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 18:51:26.194726 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.194698 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 18:51:26.195720 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.195695 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 18:51:26.199638 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.199614 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 18:51:26.206323 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.206297 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 18:51:26.313772 ip-10-0-136-27 
kubenswrapper[2580]: I0417 18:51:26.313714 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.313772 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.313764 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314040 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.313797 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314040 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.313825 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314040 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.313883 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-config\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314040 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.313929 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10844520-2815-46a3-b729-f4694a601c41-config-out\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314040 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.314003 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314297 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.314047 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314297 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.314091 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314297 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.314123 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/10844520-2815-46a3-b729-f4694a601c41-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314297 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.314151 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314297 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.314176 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314297 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.314204 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10844520-2815-46a3-b729-f4694a601c41-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314297 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.314283 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-web-config\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314617 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.314341 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314617 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.314365 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314617 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.314392 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv8s8\" (UniqueName: 
\"kubernetes.io/projected/10844520-2815-46a3-b729-f4694a601c41-kube-api-access-hv8s8\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.314617 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.314474 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.415756 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.415715 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.415756 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.415762 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv8s8\" (UniqueName: \"kubernetes.io/projected/10844520-2815-46a3-b729-f4694a601c41-kube-api-access-hv8s8\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416025 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.415812 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416025 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.415860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416025 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.415908 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416025 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.415937 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416025 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.415968 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416025 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.416001 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-config\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416025 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.416024 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10844520-2815-46a3-b729-f4694a601c41-config-out\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416371 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.416055 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416371 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.416092 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416371 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.416123 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416371 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.416159 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/10844520-2815-46a3-b729-f4694a601c41-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416371 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.416188 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416371 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.416224 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416371 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.416266 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/10844520-2815-46a3-b729-f4694a601c41-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.416371 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.416307 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-web-config\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.417145 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.417118 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.417347 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.417322 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.417578 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.417554 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.418019 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.417994 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/10844520-2815-46a3-b729-f4694a601c41-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.419578 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.419200 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.419876 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.419671 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10844520-2815-46a3-b729-f4694a601c41-config-out\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.419876 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.419733 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.419876 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.419841 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.420349 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.420300 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-config\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.421034 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.421009 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.421158 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.421139 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.422172 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.422145 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.422542 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.422509 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.422694 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.422669 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.423084 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.423043 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.423185 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.423100 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-web-config\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.423260 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.423238 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.424114 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.424089 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv8s8\" (UniqueName: \"kubernetes.io/projected/10844520-2815-46a3-b729-f4694a601c41-kube-api-access-hv8s8\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.424421 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.424397 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10844520-2815-46a3-b729-f4694a601c41-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.431160 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.431133 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65678c6864-lv862" event={"ID":"a62685de-f03c-4d27-8601-ef4f4810a833","Type":"ContainerStarted","Data":"bb37cf41f5dcc2c3b1c6d8bc4c98be61460bd5dad2f9029a27816a5481c0f49c"} Apr 17 18:51:26.431272 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.431173 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65678c6864-lv862" event={"ID":"a62685de-f03c-4d27-8601-ef4f4810a833","Type":"ContainerStarted","Data":"815add40c32c2d4e38250014234947026509c31d6e0ecbe783f4097a940874be"} Apr 17 18:51:26.431272 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.431185 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65678c6864-lv862" event={"ID":"a62685de-f03c-4d27-8601-ef4f4810a833","Type":"ContainerStarted","Data":"4dafdb79eaf84eb897c1e36fe5e61ee72559d141799f79d9254729ca66791036"} Apr 17 18:51:26.500632 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.500607 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:26.641049 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:26.641023 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 18:51:26.644455 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:26.644430 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10844520_2815_46a3_b729_f4694a601c41.slice/crio-bb9fc0c17051fb7c2a2d972d16324b864ff9512f6a3e769ce168ca89e38e8aec WatchSource:0}: Error finding container bb9fc0c17051fb7c2a2d972d16324b864ff9512f6a3e769ce168ca89e38e8aec: Status 404 returned error can't find the container with id bb9fc0c17051fb7c2a2d972d16324b864ff9512f6a3e769ce168ca89e38e8aec Apr 17 18:51:27.436213 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:27.436172 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerStarted","Data":"bb9fc0c17051fb7c2a2d972d16324b864ff9512f6a3e769ce168ca89e38e8aec"} Apr 17 18:51:27.439266 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:27.439238 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65678c6864-lv862" event={"ID":"a62685de-f03c-4d27-8601-ef4f4810a833","Type":"ContainerStarted","Data":"32ec2db215cb65db31bf0b61ae941cc44c9ebbf10d83035f0d919a395511cf10"} Apr 17 18:51:27.439396 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:27.439291 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65678c6864-lv862" event={"ID":"a62685de-f03c-4d27-8601-ef4f4810a833","Type":"ContainerStarted","Data":"b1280e7ed776b5315541ab52b98960fadd6c428eb5273324093ce5da6c5570c7"} Apr 17 18:51:27.439396 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:27.439306 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65678c6864-lv862" event={"ID":"a62685de-f03c-4d27-8601-ef4f4810a833","Type":"ContainerStarted","Data":"2115b55a12120518ebac71a4b8b6f660c81ad65fcc2e9d9449f4799a1c349db9"} Apr 17 18:51:27.439488 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:27.439471 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:27.461481 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:27.461430 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-65678c6864-lv862" podStartSLOduration=2.336635897 podStartE2EDuration="5.461410892s" podCreationTimestamp="2026-04-17 18:51:22 +0000 UTC" firstStartedPulling="2026-04-17 18:51:23.376198295 +0000 UTC m=+139.028734199" lastFinishedPulling="2026-04-17 18:51:26.50097328 +0000 UTC m=+142.153509194" observedRunningTime="2026-04-17 18:51:27.460720157 +0000 UTC m=+143.113256093" watchObservedRunningTime="2026-04-17 18:51:27.461410892 +0000 UTC m=+143.113946819" Apr 17 18:51:28.448624 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:28.448592 2580 generic.go:358] "Generic (PLEG): container finished" podID="10844520-2815-46a3-b729-f4694a601c41" containerID="7663f3f3ae903be2cfb9b0b4086f5f2ed8fb75d72d512890d2ca4e8f10946b56" exitCode=0 Apr 17 18:51:28.449075 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:28.448677 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerDied","Data":"7663f3f3ae903be2cfb9b0b4086f5f2ed8fb75d72d512890d2ca4e8f10946b56"} Apr 17 18:51:30.242651 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.242615 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67d5f96857-4xrp8"] Apr 17 18:51:30.243091 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:51:30.242936 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" podUID="14c44177-594e-4d5c-a3a1-0e8d4627aabd" Apr 17 18:51:30.455803 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.455770 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:51:30.460476 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.460455 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:51:30.563226 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.563137 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14c44177-594e-4d5c-a3a1-0e8d4627aabd-ca-trust-extracted\") pod \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " Apr 17 18:51:30.563226 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.563186 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgtvf\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-kube-api-access-lgtvf\") pod \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " Apr 17 18:51:30.563226 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.563220 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-certificates\") pod \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " Apr 17 18:51:30.563487 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.563244 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14c44177-594e-4d5c-a3a1-0e8d4627aabd-image-registry-private-configuration\") pod \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " Apr 17 18:51:30.563487 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.563305 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-bound-sa-token\") pod \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " Apr 17 18:51:30.563487 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.563336 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14c44177-594e-4d5c-a3a1-0e8d4627aabd-trusted-ca\") pod \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " Apr 17 18:51:30.563487 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.563378 2580 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14c44177-594e-4d5c-a3a1-0e8d4627aabd-installation-pull-secrets\") pod \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\" (UID: \"14c44177-594e-4d5c-a3a1-0e8d4627aabd\") " Apr 17 18:51:30.563679 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.563482 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14c44177-594e-4d5c-a3a1-0e8d4627aabd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "14c44177-594e-4d5c-a3a1-0e8d4627aabd" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:51:30.563779 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.563750 2580 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14c44177-594e-4d5c-a3a1-0e8d4627aabd-ca-trust-extracted\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:51:30.563915 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.563827 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c44177-594e-4d5c-a3a1-0e8d4627aabd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "14c44177-594e-4d5c-a3a1-0e8d4627aabd" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:51:30.564054 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.564026 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "14c44177-594e-4d5c-a3a1-0e8d4627aabd" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:51:30.566231 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.566195 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c44177-594e-4d5c-a3a1-0e8d4627aabd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "14c44177-594e-4d5c-a3a1-0e8d4627aabd" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:51:30.566321 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.566252 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-kube-api-access-lgtvf" (OuterVolumeSpecName: "kube-api-access-lgtvf") pod "14c44177-594e-4d5c-a3a1-0e8d4627aabd" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd"). InnerVolumeSpecName "kube-api-access-lgtvf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:51:30.566378 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.566322 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "14c44177-594e-4d5c-a3a1-0e8d4627aabd" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:51:30.566378 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.566349 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c44177-594e-4d5c-a3a1-0e8d4627aabd-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "14c44177-594e-4d5c-a3a1-0e8d4627aabd" (UID: "14c44177-594e-4d5c-a3a1-0e8d4627aabd"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:51:30.664462 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.664427 2580 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14c44177-594e-4d5c-a3a1-0e8d4627aabd-installation-pull-secrets\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:51:30.664462 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.664459 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lgtvf\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-kube-api-access-lgtvf\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:51:30.664462 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.664471 2580 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-certificates\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:51:30.664716 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.664480 2580 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14c44177-594e-4d5c-a3a1-0e8d4627aabd-image-registry-private-configuration\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:51:30.664716 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.664491 2580 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-bound-sa-token\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:51:30.664716 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:30.664500 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14c44177-594e-4d5c-a3a1-0e8d4627aabd-trusted-ca\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:51:31.459344 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:31.459319 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67d5f96857-4xrp8" Apr 17 18:51:31.503486 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:31.503457 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67d5f96857-4xrp8"] Apr 17 18:51:31.508390 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:31.508363 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-67d5f96857-4xrp8"] Apr 17 18:51:31.573470 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:31.573431 2580 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14c44177-594e-4d5c-a3a1-0e8d4627aabd-registry-tls\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:51:32.465575 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:32.465536 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerStarted","Data":"3dd50478a93ed7d0e95672333fe575083a4bcca345789c8b269a4ffe516bbcea"} Apr 17 18:51:32.465575 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:32.465578 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerStarted","Data":"4c3ddbf91ec0274f00a6e489c23e54dec98854b7f8d8e4c1e7586c324094fe38"} Apr 17 18:51:32.466033 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:32.465597 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerStarted","Data":"0be32ce2f3b70adb1c0ab753692238b7c3cd0d7354ce30ade6e07b83783e497c"} Apr 17 18:51:32.466033 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:32.465607 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerStarted","Data":"8786b213b1899d153f94eca67d8b64c85b19369c5297bfb89170c56aab6c45be"} Apr 17 18:51:32.466033 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:32.465615 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerStarted","Data":"a551013c21bf93a54e78f406de1f7199c797920da8838145127c6eddb77c1556"} Apr 17 18:51:32.466033 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:32.465630 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerStarted","Data":"5e9292f545d70a20668e7cf0bf33b30c3b6f1b6e666a16fe9aa42777cedfbfd6"} Apr 17 18:51:32.495830 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:32.495655 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.712601343 podStartE2EDuration="6.495637148s" podCreationTimestamp="2026-04-17 18:51:26 +0000 UTC" firstStartedPulling="2026-04-17 18:51:26.646508457 +0000 UTC m=+142.299044370" lastFinishedPulling="2026-04-17 18:51:31.429544269 +0000 UTC m=+147.082080175" observedRunningTime="2026-04-17 18:51:32.494170081 +0000 UTC m=+148.146706010" watchObservedRunningTime="2026-04-17 18:51:32.495637148 +0000 UTC m=+148.148173074" Apr 17 18:51:32.973416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:32.973386 2580 scope.go:117] "RemoveContainer" 
containerID="6bf7aaf64b8fe477c6e73373b87023429815ec918f26d405f5501e227b618777" Apr 17 18:51:32.977651 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:32.977608 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14c44177-594e-4d5c-a3a1-0e8d4627aabd" path="/var/lib/kubelet/pods/14c44177-594e-4d5c-a3a1-0e8d4627aabd/volumes" Apr 17 18:51:33.455530 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:33.455505 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-65678c6864-lv862" Apr 17 18:51:33.469611 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:33.469588 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 18:51:33.469993 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:33.469707 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" event={"ID":"160e7e3f-dd7c-4341-8e04-ec0fc5728152","Type":"ContainerStarted","Data":"4accd6e81e386a9541f729743d9b9ffa3d47fd0c55f9327154ccdf64f44f15ff"} Apr 17 18:51:33.470036 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:33.469999 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:51:33.474576 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:33.474556 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" Apr 17 18:51:33.494764 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:33.494704 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-whmp8" podStartSLOduration=53.672703041 podStartE2EDuration="56.494687309s" podCreationTimestamp="2026-04-17 18:50:37 +0000 UTC" firstStartedPulling="2026-04-17 18:50:37.768462586 +0000 UTC m=+93.420998498" lastFinishedPulling="2026-04-17 18:50:40.590446862 +0000 UTC m=+96.242982766" observedRunningTime="2026-04-17 18:51:33.493151291 +0000 UTC m=+149.145687218" watchObservedRunningTime="2026-04-17 18:51:33.494687309 +0000 UTC m=+149.147223236" Apr 17 18:51:36.501370 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:36.501335 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:51:41.252070 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:51:41.252009 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-94ggh" podUID="98b160b8-551d-443c-a3a0-4d046919e27c" Apr 17 18:51:41.259170 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:51:41.259124 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-l4g6s" podUID="648e7199-fd23-4496-ad24-5b9e829d77fa" Apr 17 18:51:41.496189 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:41.496144 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l4g6s" Apr 17 18:51:41.496393 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:41.496145 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:51:46.206395 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:46.206343 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:51:46.206395 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:46.206401 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:51:46.208954 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:46.208926 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/648e7199-fd23-4496-ad24-5b9e829d77fa-metrics-tls\") pod \"dns-default-l4g6s\" (UID: \"648e7199-fd23-4496-ad24-5b9e829d77fa\") " pod="openshift-dns/dns-default-l4g6s" Apr 17 18:51:46.209072 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:46.208978 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b160b8-551d-443c-a3a0-4d046919e27c-cert\") pod \"ingress-canary-94ggh\" (UID: \"98b160b8-551d-443c-a3a0-4d046919e27c\") " pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:51:46.299131 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:46.299092 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ctvd8\"" Apr 17 18:51:46.299793 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:46.299776 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ppj8p\"" Apr 17 18:51:46.307368 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:46.307345 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l4g6s" Apr 17 18:51:46.307368 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:46.307354 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-94ggh" Apr 17 18:51:46.442015 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:46.441989 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l4g6s"] Apr 17 18:51:46.444927 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:46.444869 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod648e7199_fd23_4496_ad24_5b9e829d77fa.slice/crio-05aad52a52e4b5bfea22d1a971880627048c38eb97976d9d50f351f924d5be19 WatchSource:0}: Error finding container 05aad52a52e4b5bfea22d1a971880627048c38eb97976d9d50f351f924d5be19: Status 404 returned error can't find the container with id 05aad52a52e4b5bfea22d1a971880627048c38eb97976d9d50f351f924d5be19 Apr 17 18:51:46.465606 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:46.465462 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-94ggh"] Apr 17 18:51:46.468626 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:51:46.468597 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98b160b8_551d_443c_a3a0_4d046919e27c.slice/crio-2f8c68712103e01ab8405ace9347e61a1fa17075ba1ee3a273cd5614eaa99a45 WatchSource:0}: Error finding container 2f8c68712103e01ab8405ace9347e61a1fa17075ba1ee3a273cd5614eaa99a45: Status 404 returned error can't find the container with id 2f8c68712103e01ab8405ace9347e61a1fa17075ba1ee3a273cd5614eaa99a45 Apr 17 18:51:46.510761 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:46.510726 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l4g6s" event={"ID":"648e7199-fd23-4496-ad24-5b9e829d77fa","Type":"ContainerStarted","Data":"05aad52a52e4b5bfea22d1a971880627048c38eb97976d9d50f351f924d5be19"} Apr 17 18:51:46.511668 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:46.511634 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-94ggh" event={"ID":"98b160b8-551d-443c-a3a0-4d046919e27c","Type":"ContainerStarted","Data":"2f8c68712103e01ab8405ace9347e61a1fa17075ba1ee3a273cd5614eaa99a45"} Apr 17 18:51:49.523282 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:49.523244 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-94ggh" event={"ID":"98b160b8-551d-443c-a3a0-4d046919e27c","Type":"ContainerStarted","Data":"b35ecd464fe3408b453463bf14a22e27b2ae934dd456c8ab9e5dd0b9e21eaf9c"} Apr 17 18:51:49.524729 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:49.524703 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l4g6s" event={"ID":"648e7199-fd23-4496-ad24-5b9e829d77fa","Type":"ContainerStarted","Data":"321358c36b65bfd29a89ce37fc9036e3d2eec6939504e34434ec36d95d5db102"} Apr 17 18:51:49.524729 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:49.524732 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l4g6s" event={"ID":"648e7199-fd23-4496-ad24-5b9e829d77fa","Type":"ContainerStarted","Data":"2caed4064b580bcb610408bb197cbd56866d9d3ea7ce5db023228aa07872f325"} Apr 17 18:51:49.524932 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:49.524914 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-l4g6s" Apr 17 18:51:49.537500 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:49.537460 2580 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress-canary/ingress-canary-94ggh" podStartSLOduration=129.442887122 podStartE2EDuration="2m11.537448123s" podCreationTimestamp="2026-04-17 18:49:38 +0000 UTC" firstStartedPulling="2026-04-17 18:51:46.470721715 +0000 UTC m=+162.123257619" lastFinishedPulling="2026-04-17 18:51:48.565282713 +0000 UTC m=+164.217818620" observedRunningTime="2026-04-17 18:51:49.536295592 +0000 UTC m=+165.188831519" watchObservedRunningTime="2026-04-17 18:51:49.537448123 +0000 UTC m=+165.189984053" Apr 17 18:51:49.552164 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:49.552112 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-l4g6s" podStartSLOduration=129.437553713 podStartE2EDuration="2m11.552097695s" podCreationTimestamp="2026-04-17 18:49:38 +0000 UTC" firstStartedPulling="2026-04-17 18:51:46.446969533 +0000 UTC m=+162.099505437" lastFinishedPulling="2026-04-17 18:51:48.561513502 +0000 UTC m=+164.214049419" observedRunningTime="2026-04-17 18:51:49.551582205 +0000 UTC m=+165.204118132" watchObservedRunningTime="2026-04-17 18:51:49.552097695 +0000 UTC m=+165.204633621" Apr 17 18:51:51.532397 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:51.532360 2580 generic.go:358] "Generic (PLEG): container finished" podID="40e1d37f-b2bf-44fe-a623-a1c64ed6ba58" containerID="e55912faa2293745ea8d5ba2cc5a94b711c2e323a4fd96a3daecbde7ba08739f" exitCode=0 Apr 17 18:51:51.532856 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:51.532422 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" event={"ID":"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58","Type":"ContainerDied","Data":"e55912faa2293745ea8d5ba2cc5a94b711c2e323a4fd96a3daecbde7ba08739f"} Apr 17 18:51:51.532856 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:51.532807 2580 scope.go:117] "RemoveContainer" containerID="e55912faa2293745ea8d5ba2cc5a94b711c2e323a4fd96a3daecbde7ba08739f" Apr 17 18:51:52.539183 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:52.539148 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h9jnl" event={"ID":"40e1d37f-b2bf-44fe-a623-a1c64ed6ba58","Type":"ContainerStarted","Data":"08a304fae21193c74357763c49da8a41fc355ca3d5a4a092c6075ac866055931"} Apr 17 18:51:59.530398 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:51:59.530370 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-l4g6s" Apr 17 18:52:26.501129 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:26.501088 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:26.517593 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:26.517561 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:26.661002 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:26.660974 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:44.495065 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.495025 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 18:52:44.495593 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.495517 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="10844520-2815-46a3-b729-f4694a601c41" 
containerName="kube-rbac-proxy" containerID="cri-o://4c3ddbf91ec0274f00a6e489c23e54dec98854b7f8d8e4c1e7586c324094fe38" gracePeriod=600 Apr 17 18:52:44.495593 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.495562 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="config-reloader" containerID="cri-o://a551013c21bf93a54e78f406de1f7199c797920da8838145127c6eddb77c1556" gracePeriod=600 Apr 17 18:52:44.495700 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.495512 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="prometheus" containerID="cri-o://5e9292f545d70a20668e7cf0bf33b30c3b6f1b6e666a16fe9aa42777cedfbfd6" gracePeriod=600 Apr 17 18:52:44.495700 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.495538 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="thanos-sidecar" containerID="cri-o://8786b213b1899d153f94eca67d8b64c85b19369c5297bfb89170c56aab6c45be" gracePeriod=600 Apr 17 18:52:44.495700 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.495538 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="kube-rbac-proxy-web" containerID="cri-o://0be32ce2f3b70adb1c0ab753692238b7c3cd0d7354ce30ade6e07b83783e497c" gracePeriod=600 Apr 17 18:52:44.495843 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.495517 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="kube-rbac-proxy-thanos" containerID="cri-o://3dd50478a93ed7d0e95672333fe575083a4bcca345789c8b269a4ffe516bbcea" gracePeriod=600 Apr 17 18:52:44.706355 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.706322 2580 generic.go:358] "Generic (PLEG): container finished" podID="10844520-2815-46a3-b729-f4694a601c41" containerID="3dd50478a93ed7d0e95672333fe575083a4bcca345789c8b269a4ffe516bbcea" exitCode=0 Apr 17 18:52:44.706355 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.706351 2580 generic.go:358] "Generic (PLEG): container finished" podID="10844520-2815-46a3-b729-f4694a601c41" containerID="4c3ddbf91ec0274f00a6e489c23e54dec98854b7f8d8e4c1e7586c324094fe38" exitCode=0 Apr 17 18:52:44.706531 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.706360 2580 generic.go:358] "Generic (PLEG): container finished" podID="10844520-2815-46a3-b729-f4694a601c41" containerID="0be32ce2f3b70adb1c0ab753692238b7c3cd0d7354ce30ade6e07b83783e497c" exitCode=0 Apr 17 18:52:44.706531 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.706373 2580 generic.go:358] "Generic (PLEG): container finished" podID="10844520-2815-46a3-b729-f4694a601c41" containerID="8786b213b1899d153f94eca67d8b64c85b19369c5297bfb89170c56aab6c45be" exitCode=0 Apr 17 18:52:44.706531 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.706383 2580 generic.go:358] "Generic (PLEG): container finished" podID="10844520-2815-46a3-b729-f4694a601c41" containerID="a551013c21bf93a54e78f406de1f7199c797920da8838145127c6eddb77c1556" exitCode=0 Apr 17 18:52:44.706531 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.706391 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="10844520-2815-46a3-b729-f4694a601c41" containerID="5e9292f545d70a20668e7cf0bf33b30c3b6f1b6e666a16fe9aa42777cedfbfd6" exitCode=0 Apr 17 18:52:44.706531 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.706389 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerDied","Data":"3dd50478a93ed7d0e95672333fe575083a4bcca345789c8b269a4ffe516bbcea"} Apr 17 18:52:44.706531 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.706429 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerDied","Data":"4c3ddbf91ec0274f00a6e489c23e54dec98854b7f8d8e4c1e7586c324094fe38"} Apr 17 18:52:44.706531 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.706445 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerDied","Data":"0be32ce2f3b70adb1c0ab753692238b7c3cd0d7354ce30ade6e07b83783e497c"} Apr 17 18:52:44.706531 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.706454 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerDied","Data":"8786b213b1899d153f94eca67d8b64c85b19369c5297bfb89170c56aab6c45be"} Apr 17 18:52:44.706531 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.706463 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerDied","Data":"a551013c21bf93a54e78f406de1f7199c797920da8838145127c6eddb77c1556"} Apr 17 18:52:44.706531 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.706472 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerDied","Data":"5e9292f545d70a20668e7cf0bf33b30c3b6f1b6e666a16fe9aa42777cedfbfd6"} Apr 17 18:52:44.745475 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.745416 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:44.809743 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.809713 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-tls\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.809743 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.809749 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10844520-2815-46a3-b729-f4694a601c41-config-out\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810021 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.809776 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/10844520-2815-46a3-b729-f4694a601c41-prometheus-k8s-db\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810021 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.809813 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-prometheus-trusted-ca-bundle\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810021 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.809935 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-metrics-client-ca\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810021 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.809977 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-thanos-prometheus-http-client-file\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810021 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810005 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810772 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810028 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10844520-2815-46a3-b729-f4694a601c41-tls-assets\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810772 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810074 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-grpc-tls\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: 
\"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810772 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810106 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-web-config\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810772 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810145 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-kube-rbac-proxy\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810772 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810183 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-metrics-client-certs\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810772 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810218 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-serving-certs-ca-bundle\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810772 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810242 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-kubelet-serving-ca-bundle\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810772 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810274 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810772 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810320 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-config\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810772 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810347 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-prometheus-k8s-rulefiles-0\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: \"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.810772 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810388 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv8s8\" (UniqueName: \"kubernetes.io/projected/10844520-2815-46a3-b729-f4694a601c41-kube-api-access-hv8s8\") pod \"10844520-2815-46a3-b729-f4694a601c41\" (UID: 
\"10844520-2815-46a3-b729-f4694a601c41\") " Apr 17 18:52:44.811388 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810239 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:52:44.811388 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810907 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10844520-2815-46a3-b729-f4694a601c41-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:52:44.811388 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.810552 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:52:44.811388 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.811295 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:52:44.811614 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.811437 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:52:44.813435 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.813400 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:52:44.813626 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.813493 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:52:44.813882 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.813803 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:52:44.813882 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.813868 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10844520-2815-46a3-b729-f4694a601c41-config-out" (OuterVolumeSpecName: "config-out") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:52:44.813882 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.813868 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:52:44.814146 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.814112 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-config" (OuterVolumeSpecName: "config") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:52:44.814884 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.814845 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:52:44.814884 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.814849 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10844520-2815-46a3-b729-f4694a601c41-kube-api-access-hv8s8" (OuterVolumeSpecName: "kube-api-access-hv8s8") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "kube-api-access-hv8s8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:52:44.815268 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.815232 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:52:44.815622 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.815604 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:52:44.816078 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.816049 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10844520-2815-46a3-b729-f4694a601c41-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:52:44.816520 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.816499 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:52:44.827555 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.827526 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-web-config" (OuterVolumeSpecName: "web-config") pod "10844520-2815-46a3-b729-f4694a601c41" (UID: "10844520-2815-46a3-b729-f4694a601c41"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:52:44.911641 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911609 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911641 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911637 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911641 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911647 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911658 2580 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-config\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911667 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911676 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hv8s8\" (UniqueName: \"kubernetes.io/projected/10844520-2815-46a3-b729-f4694a601c41-kube-api-access-hv8s8\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911686 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-tls\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911694 2580 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10844520-2815-46a3-b729-f4694a601c41-config-out\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911702 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/10844520-2815-46a3-b729-f4694a601c41-prometheus-k8s-db\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911712 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-prometheus-trusted-ca-bundle\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911720 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/10844520-2815-46a3-b729-f4694a601c41-configmap-metrics-client-ca\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911730 2580 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-thanos-prometheus-http-client-file\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911739 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911748 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10844520-2815-46a3-b729-f4694a601c41-tls-assets\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911756 2580 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-grpc-tls\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911764 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-web-config\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911771 2580 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-kube-rbac-proxy\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:44.911835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:44.911779 2580 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/10844520-2815-46a3-b729-f4694a601c41-secret-metrics-client-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:52:45.711698 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.711657 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10844520-2815-46a3-b729-f4694a601c41","Type":"ContainerDied","Data":"bb9fc0c17051fb7c2a2d972d16324b864ff9512f6a3e769ce168ca89e38e8aec"} Apr 17 18:52:45.712116 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.711717 2580 scope.go:117] "RemoveContainer" containerID="3dd50478a93ed7d0e95672333fe575083a4bcca345789c8b269a4ffe516bbcea" Apr 17 18:52:45.712116 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.711791 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.719403 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.719382 2580 scope.go:117] "RemoveContainer" containerID="4c3ddbf91ec0274f00a6e489c23e54dec98854b7f8d8e4c1e7586c324094fe38" Apr 17 18:52:45.726145 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.726128 2580 scope.go:117] "RemoveContainer" containerID="0be32ce2f3b70adb1c0ab753692238b7c3cd0d7354ce30ade6e07b83783e497c" Apr 17 18:52:45.731824 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.731802 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 18:52:45.734521 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.734496 2580 scope.go:117] "RemoveContainer" containerID="8786b213b1899d153f94eca67d8b64c85b19369c5297bfb89170c56aab6c45be" Apr 17 18:52:45.735360 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.735338 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 18:52:45.741881 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.741865 2580 scope.go:117] "RemoveContainer" containerID="a551013c21bf93a54e78f406de1f7199c797920da8838145127c6eddb77c1556" Apr 17 18:52:45.748422 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.748403 2580 scope.go:117] "RemoveContainer" containerID="5e9292f545d70a20668e7cf0bf33b30c3b6f1b6e666a16fe9aa42777cedfbfd6" Apr 17 18:52:45.755617 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.755587 2580 scope.go:117] "RemoveContainer" containerID="7663f3f3ae903be2cfb9b0b4086f5f2ed8fb75d72d512890d2ca4e8f10946b56" Apr 17 18:52:45.756907 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.756868 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 18:52:45.757292 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757275 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="config-reloader" Apr 17 18:52:45.757292 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757292 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="config-reloader" Apr 17 18:52:45.757416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757301 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="kube-rbac-proxy" Apr 17 18:52:45.757416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757307 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="kube-rbac-proxy" Apr 17 18:52:45.757416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757313 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="kube-rbac-proxy-web" Apr 17 18:52:45.757416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757323 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="kube-rbac-proxy-web" Apr 17 18:52:45.757416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757333 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="kube-rbac-proxy-thanos" Apr 17 18:52:45.757416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757339 2580 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="kube-rbac-proxy-thanos" Apr 17 18:52:45.757416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757347 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="init-config-reloader" Apr 17 18:52:45.757416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757351 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="init-config-reloader" Apr 17 18:52:45.757416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757360 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="thanos-sidecar" Apr 17 18:52:45.757416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757365 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="thanos-sidecar" Apr 17 18:52:45.757416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757375 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="prometheus" Apr 17 18:52:45.757416 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757380 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="prometheus" Apr 17 18:52:45.757780 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757456 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="kube-rbac-proxy" Apr 17 18:52:45.757780 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757465 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="prometheus" Apr 17 18:52:45.757780 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757470 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="config-reloader" Apr 17 18:52:45.757780 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757482 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="kube-rbac-proxy-thanos" Apr 17 18:52:45.757780 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757492 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="thanos-sidecar" Apr 17 18:52:45.757780 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.757500 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="10844520-2815-46a3-b729-f4694a601c41" containerName="kube-rbac-proxy-web" Apr 17 18:52:45.762812 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.762777 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.765021 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.765000 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 18:52:45.765333 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.765318 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 18:52:45.765632 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.765613 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5p2rppk6l278u\"" Apr 17 18:52:45.765850 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.765835 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 18:52:45.766160 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.766036 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qmls2\"" Apr 17 18:52:45.766160 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.766039 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 18:52:45.766160 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.766108 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 18:52:45.766160 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.766123 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 18:52:45.766381 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.766276 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 18:52:45.766433 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.766393 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 18:52:45.766594 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.766575 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 18:52:45.766643 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.766611 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 18:52:45.766838 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.766816 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 18:52:45.767866 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.767849 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 18:52:45.771003 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.770983 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 18:52:45.772464 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.772444 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 18:52:45.817580 ip-10-0-136-27 
kubenswrapper[2580]: I0417 18:52:45.817548 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817706 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817585 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817706 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817627 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817706 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817644 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817706 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817664 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817838 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817741 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817838 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817767 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817838 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817788 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817838 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817810 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-config-out\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817978 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817839 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817978 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817866 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817978 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817883 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817978 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817927 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817978 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817944 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49m9k\" (UniqueName: \"kubernetes.io/projected/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-kube-api-access-49m9k\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.817978 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817962 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.818143 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.817991 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-web-config\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.818143 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.818019 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-config\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.818143 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.818035 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919031 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.918996 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919187 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919040 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919187 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919068 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919187 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919090 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919187 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919110 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49m9k\" (UniqueName: \"kubernetes.io/projected/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-kube-api-access-49m9k\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919187 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919136 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919187 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919164 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-web-config\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919470 ip-10-0-136-27 
kubenswrapper[2580]: I0417 18:52:45.919193 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-config\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919470 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919221 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919470 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919254 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919470 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919293 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919470 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919327 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919470 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919353 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919470 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919388 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919470 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919441 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919470 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919467 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.919930 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919498 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.920002 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.919979 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.920294 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.920262 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.920409 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.920326 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-config-out\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.922735 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.922441 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.922735 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.922649 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-config\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.922735 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.922654 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.922971 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.922752 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-web-config\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.922971 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.922934 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.923074 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.922969 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.923776 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.923223 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.923776 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.923248 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.923776 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.923256 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.923776 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.923732 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-config-out\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.924074 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.924052 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.924952 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.924926 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.925241 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.925221 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.925383 ip-10-0-136-27 kubenswrapper[2580]: I0417 
18:52:45.925366 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.926908 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.926869 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49m9k\" (UniqueName: \"kubernetes.io/projected/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-kube-api-access-49m9k\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:45.926996 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:45.926886 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c212ee4b-5e61-4b8c-8393-d10a84fbbf85-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c212ee4b-5e61-4b8c-8393-d10a84fbbf85\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:46.073330 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:46.073250 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:52:46.204738 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:46.204709 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 18:52:46.206601 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:52:46.206563 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc212ee4b_5e61_4b8c_8393_d10a84fbbf85.slice/crio-1432228f1bfd57466f6113182a08b3756d941048ae9af547fb0901e0ad475505 WatchSource:0}: Error finding container 1432228f1bfd57466f6113182a08b3756d941048ae9af547fb0901e0ad475505: Status 404 returned error can't find the container with id 1432228f1bfd57466f6113182a08b3756d941048ae9af547fb0901e0ad475505 Apr 17 18:52:46.718009 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:46.717974 2580 generic.go:358] "Generic (PLEG): container finished" podID="c212ee4b-5e61-4b8c-8393-d10a84fbbf85" containerID="08b652ff9badc0400b2326c65281145795c166643fce8bd77e72d3873b5efb94" exitCode=0 Apr 17 18:52:46.718445 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:46.718071 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c212ee4b-5e61-4b8c-8393-d10a84fbbf85","Type":"ContainerDied","Data":"08b652ff9badc0400b2326c65281145795c166643fce8bd77e72d3873b5efb94"} Apr 17 18:52:46.718445 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:46.718113 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c212ee4b-5e61-4b8c-8393-d10a84fbbf85","Type":"ContainerStarted","Data":"1432228f1bfd57466f6113182a08b3756d941048ae9af547fb0901e0ad475505"} Apr 17 18:52:46.981535 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:46.981504 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10844520-2815-46a3-b729-f4694a601c41" path="/var/lib/kubelet/pods/10844520-2815-46a3-b729-f4694a601c41/volumes" Apr 17 18:52:47.725533 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:47.725499 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c212ee4b-5e61-4b8c-8393-d10a84fbbf85","Type":"ContainerStarted","Data":"c0426633bbb9b0324175fbdfebd74058d450c8be219322c9d402ba902a0621f5"} Apr 17 18:52:47.725533 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:47.725533 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c212ee4b-5e61-4b8c-8393-d10a84fbbf85","Type":"ContainerStarted","Data":"95375e873f2c8df5d093aa38796bfedfa839d4293131c85b1969b77458381d4e"} Apr 17 18:52:47.725974 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:47.725545 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c212ee4b-5e61-4b8c-8393-d10a84fbbf85","Type":"ContainerStarted","Data":"5db4a2b798d6e4610f235102980cb90b372768e1f519563481faf15ddbea4d20"} Apr 17 18:52:47.725974 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:47.725554 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c212ee4b-5e61-4b8c-8393-d10a84fbbf85","Type":"ContainerStarted","Data":"9485fa85822c3f0f0c1f374de3d6fd72f2cb1c32e81f8377e3dafae0f0ec7a34"} Apr 17 18:52:47.725974 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:47.725562 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c212ee4b-5e61-4b8c-8393-d10a84fbbf85","Type":"ContainerStarted","Data":"541417821cf1b33ad8e1ef633ab48faa58763ca8620f1d7d51b49d066df16ba9"} Apr 17 18:52:47.725974 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:47.725571 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c212ee4b-5e61-4b8c-8393-d10a84fbbf85","Type":"ContainerStarted","Data":"39e8969a83d68cd64b1b432758a6ee6c69b7ca4f77e8b2e8c58f480eeedbec32"} Apr 17 18:52:47.751755 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:47.751697 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.751682254 podStartE2EDuration="2.751682254s" podCreationTimestamp="2026-04-17 18:52:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:52:47.749615783 +0000 UTC m=+223.402151715" watchObservedRunningTime="2026-04-17 18:52:47.751682254 +0000 UTC m=+223.404218179" Apr 17 18:52:51.074344 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:52:51.074306 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:53:46.074381 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:53:46.074336 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:53:46.090706 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:53:46.090677 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:53:46.924639 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:53:46.924612 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 18:54:04.873611 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:54:04.873579 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 18:54:04.877450 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:54:04.877385 2580 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 18:54:04.884301 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:54:04.884276 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 18:54:04.884949 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:54:04.884930 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 18:54:04.887142 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:54:04.887124 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 18:56:01.560185 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.560149 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx"] Apr 17 18:56:01.563346 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.563323 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 18:56:01.565657 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.565635 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 18:56:01.565790 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.565705 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 18:56:01.565859 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.565795 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 18:56:01.565965 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.565884 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 18:56:01.566089 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.566055 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-w5cl6\"" Apr 17 18:56:01.576176 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.576154 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx"] Apr 17 18:56:01.642421 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.642383 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cd23e58b-3562-491c-b3d9-018ee82978f8-webhook-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-ssdkx\" (UID: \"cd23e58b-3562-491c-b3d9-018ee82978f8\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 18:56:01.642600 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.642429 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cd23e58b-3562-491c-b3d9-018ee82978f8-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-ssdkx\" (UID: \"cd23e58b-3562-491c-b3d9-018ee82978f8\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 
18:56:01.642600 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.642522 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wm6c\" (UniqueName: \"kubernetes.io/projected/cd23e58b-3562-491c-b3d9-018ee82978f8-kube-api-access-4wm6c\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-ssdkx\" (UID: \"cd23e58b-3562-491c-b3d9-018ee82978f8\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 18:56:01.743319 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.743285 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wm6c\" (UniqueName: \"kubernetes.io/projected/cd23e58b-3562-491c-b3d9-018ee82978f8-kube-api-access-4wm6c\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-ssdkx\" (UID: \"cd23e58b-3562-491c-b3d9-018ee82978f8\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 18:56:01.743505 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.743328 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cd23e58b-3562-491c-b3d9-018ee82978f8-webhook-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-ssdkx\" (UID: \"cd23e58b-3562-491c-b3d9-018ee82978f8\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 18:56:01.743505 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.743366 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cd23e58b-3562-491c-b3d9-018ee82978f8-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-ssdkx\" (UID: \"cd23e58b-3562-491c-b3d9-018ee82978f8\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 18:56:01.746020 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.745992 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cd23e58b-3562-491c-b3d9-018ee82978f8-webhook-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-ssdkx\" (UID: \"cd23e58b-3562-491c-b3d9-018ee82978f8\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 18:56:01.746126 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.746033 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cd23e58b-3562-491c-b3d9-018ee82978f8-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-ssdkx\" (UID: \"cd23e58b-3562-491c-b3d9-018ee82978f8\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 18:56:01.758482 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.758457 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wm6c\" (UniqueName: \"kubernetes.io/projected/cd23e58b-3562-491c-b3d9-018ee82978f8-kube-api-access-4wm6c\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-ssdkx\" (UID: \"cd23e58b-3562-491c-b3d9-018ee82978f8\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 18:56:01.875565 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:01.875470 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 18:56:02.006098 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:02.005946 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx"] Apr 17 18:56:02.008949 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:56:02.008883 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd23e58b_3562_491c_b3d9_018ee82978f8.slice/crio-cb6fecb12b565d45e8e43ad98528f0889246553282dccb6f6b892e68a73f5a4b WatchSource:0}: Error finding container cb6fecb12b565d45e8e43ad98528f0889246553282dccb6f6b892e68a73f5a4b: Status 404 returned error can't find the container with id cb6fecb12b565d45e8e43ad98528f0889246553282dccb6f6b892e68a73f5a4b Apr 17 18:56:02.010453 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:02.010432 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:56:02.295160 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:02.295128 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" event={"ID":"cd23e58b-3562-491c-b3d9-018ee82978f8","Type":"ContainerStarted","Data":"cb6fecb12b565d45e8e43ad98528f0889246553282dccb6f6b892e68a73f5a4b"} Apr 17 18:56:05.308112 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:05.308072 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" event={"ID":"cd23e58b-3562-491c-b3d9-018ee82978f8","Type":"ContainerStarted","Data":"e5c21e13269c1fa3ac5f474d4bfcac650ae42fcb30d0c880e8bbc75eaec48703"} Apr 17 18:56:05.308496 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:05.308187 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 18:56:05.329964 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:05.329918 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" podStartSLOduration=1.885179046 podStartE2EDuration="4.329887849s" podCreationTimestamp="2026-04-17 18:56:01 +0000 UTC" firstStartedPulling="2026-04-17 18:56:02.010591163 +0000 UTC m=+417.663127070" lastFinishedPulling="2026-04-17 18:56:04.455299967 +0000 UTC m=+420.107835873" observedRunningTime="2026-04-17 18:56:05.328979804 +0000 UTC m=+420.981515723" watchObservedRunningTime="2026-04-17 18:56:05.329887849 +0000 UTC m=+420.982423775" Apr 17 18:56:16.312835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:16.312799 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-ssdkx" Apr 17 18:56:19.187852 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.187815 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b"] Apr 17 18:56:19.191062 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.191043 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.193622 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.193596 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 18:56:19.193743 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.193631 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:56:19.193810 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.193787 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 18:56:19.194600 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.194574 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 18:56:19.194680 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.194625 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-n9k5b\"" Apr 17 18:56:19.194680 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.194638 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 18:56:19.200163 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.200132 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b"] Apr 17 18:56:19.288661 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.288612 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdscj\" (UniqueName: \"kubernetes.io/projected/7ffac97d-c664-4d6c-a7ed-83dd7121891a-kube-api-access-jdscj\") pod \"lws-controller-manager-697b5bd5df-2vp4b\" (UID: \"7ffac97d-c664-4d6c-a7ed-83dd7121891a\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.288841 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.288701 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ffac97d-c664-4d6c-a7ed-83dd7121891a-metrics-cert\") pod \"lws-controller-manager-697b5bd5df-2vp4b\" (UID: \"7ffac97d-c664-4d6c-a7ed-83dd7121891a\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.288841 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.288770 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/7ffac97d-c664-4d6c-a7ed-83dd7121891a-manager-config\") pod \"lws-controller-manager-697b5bd5df-2vp4b\" (UID: \"7ffac97d-c664-4d6c-a7ed-83dd7121891a\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.288996 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.288852 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ffac97d-c664-4d6c-a7ed-83dd7121891a-cert\") pod \"lws-controller-manager-697b5bd5df-2vp4b\" (UID: \"7ffac97d-c664-4d6c-a7ed-83dd7121891a\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.390010 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.389623 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jdscj\" (UniqueName: \"kubernetes.io/projected/7ffac97d-c664-4d6c-a7ed-83dd7121891a-kube-api-access-jdscj\") pod \"lws-controller-manager-697b5bd5df-2vp4b\" (UID: \"7ffac97d-c664-4d6c-a7ed-83dd7121891a\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.390319 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.390298 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ffac97d-c664-4d6c-a7ed-83dd7121891a-metrics-cert\") pod \"lws-controller-manager-697b5bd5df-2vp4b\" (UID: \"7ffac97d-c664-4d6c-a7ed-83dd7121891a\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.390968 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.390945 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/7ffac97d-c664-4d6c-a7ed-83dd7121891a-manager-config\") pod \"lws-controller-manager-697b5bd5df-2vp4b\" (UID: \"7ffac97d-c664-4d6c-a7ed-83dd7121891a\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.391140 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.391125 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ffac97d-c664-4d6c-a7ed-83dd7121891a-cert\") pod \"lws-controller-manager-697b5bd5df-2vp4b\" (UID: \"7ffac97d-c664-4d6c-a7ed-83dd7121891a\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.391882 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.391863 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/7ffac97d-c664-4d6c-a7ed-83dd7121891a-manager-config\") pod \"lws-controller-manager-697b5bd5df-2vp4b\" (UID: \"7ffac97d-c664-4d6c-a7ed-83dd7121891a\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.394660 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.394628 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ffac97d-c664-4d6c-a7ed-83dd7121891a-metrics-cert\") pod \"lws-controller-manager-697b5bd5df-2vp4b\" (UID: \"7ffac97d-c664-4d6c-a7ed-83dd7121891a\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.398071 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.397702 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ffac97d-c664-4d6c-a7ed-83dd7121891a-cert\") pod \"lws-controller-manager-697b5bd5df-2vp4b\" (UID: \"7ffac97d-c664-4d6c-a7ed-83dd7121891a\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.398071 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.397876 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdscj\" (UniqueName: \"kubernetes.io/projected/7ffac97d-c664-4d6c-a7ed-83dd7121891a-kube-api-access-jdscj\") pod \"lws-controller-manager-697b5bd5df-2vp4b\" (UID: \"7ffac97d-c664-4d6c-a7ed-83dd7121891a\") " pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.500602 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.500574 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:19.629947 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.629915 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b"] Apr 17 18:56:19.633528 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:56:19.633501 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ffac97d_c664_4d6c_a7ed_83dd7121891a.slice/crio-5b820a6971a1d34d88df08163403f087418afb9d4f0c41e5538ed2418c1c761e WatchSource:0}: Error finding container 5b820a6971a1d34d88df08163403f087418afb9d4f0c41e5538ed2418c1c761e: Status 404 returned error can't find the container with id 5b820a6971a1d34d88df08163403f087418afb9d4f0c41e5538ed2418c1c761e Apr 17 18:56:19.664406 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.664378 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p"] Apr 17 18:56:19.669337 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.669314 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" Apr 17 18:56:19.671686 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.671665 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 18:56:19.671686 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.671676 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-lr4dd\"" Apr 17 18:56:19.671835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.671666 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 18:56:19.674881 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.674850 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p"] Apr 17 18:56:19.796077 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.795995 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/78a35df5-9008-4e23-af9a-61bac606b3ac-tmp\") pod \"kube-auth-proxy-59447f86f4-x4f2p\" (UID: \"78a35df5-9008-4e23-af9a-61bac606b3ac\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" Apr 17 18:56:19.796077 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.796037 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/78a35df5-9008-4e23-af9a-61bac606b3ac-tls-certs\") pod \"kube-auth-proxy-59447f86f4-x4f2p\" (UID: \"78a35df5-9008-4e23-af9a-61bac606b3ac\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" Apr 17 18:56:19.796077 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.796056 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrbhh\" (UniqueName: \"kubernetes.io/projected/78a35df5-9008-4e23-af9a-61bac606b3ac-kube-api-access-vrbhh\") pod \"kube-auth-proxy-59447f86f4-x4f2p\" (UID: \"78a35df5-9008-4e23-af9a-61bac606b3ac\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" Apr 17 18:56:19.896613 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.896577 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tmp\" (UniqueName: \"kubernetes.io/empty-dir/78a35df5-9008-4e23-af9a-61bac606b3ac-tmp\") pod \"kube-auth-proxy-59447f86f4-x4f2p\" (UID: \"78a35df5-9008-4e23-af9a-61bac606b3ac\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" Apr 17 18:56:19.896802 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.896625 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/78a35df5-9008-4e23-af9a-61bac606b3ac-tls-certs\") pod \"kube-auth-proxy-59447f86f4-x4f2p\" (UID: \"78a35df5-9008-4e23-af9a-61bac606b3ac\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" Apr 17 18:56:19.896802 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.896646 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrbhh\" (UniqueName: \"kubernetes.io/projected/78a35df5-9008-4e23-af9a-61bac606b3ac-kube-api-access-vrbhh\") pod \"kube-auth-proxy-59447f86f4-x4f2p\" (UID: \"78a35df5-9008-4e23-af9a-61bac606b3ac\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" Apr 17 18:56:19.899067 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.899044 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/78a35df5-9008-4e23-af9a-61bac606b3ac-tmp\") pod \"kube-auth-proxy-59447f86f4-x4f2p\" (UID: \"78a35df5-9008-4e23-af9a-61bac606b3ac\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" Apr 17 18:56:19.899230 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.899211 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/78a35df5-9008-4e23-af9a-61bac606b3ac-tls-certs\") pod \"kube-auth-proxy-59447f86f4-x4f2p\" (UID: \"78a35df5-9008-4e23-af9a-61bac606b3ac\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" Apr 17 18:56:19.904272 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.904254 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrbhh\" (UniqueName: \"kubernetes.io/projected/78a35df5-9008-4e23-af9a-61bac606b3ac-kube-api-access-vrbhh\") pod \"kube-auth-proxy-59447f86f4-x4f2p\" (UID: \"78a35df5-9008-4e23-af9a-61bac606b3ac\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" Apr 17 18:56:19.980519 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:19.980497 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" Apr 17 18:56:20.103063 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:20.103037 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p"] Apr 17 18:56:20.105451 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:56:20.105423 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78a35df5_9008_4e23_af9a_61bac606b3ac.slice/crio-5c77e2d94e94aa7f3bba387bac47afc1be11bab8391464880dac925d01063ad3 WatchSource:0}: Error finding container 5c77e2d94e94aa7f3bba387bac47afc1be11bab8391464880dac925d01063ad3: Status 404 returned error can't find the container with id 5c77e2d94e94aa7f3bba387bac47afc1be11bab8391464880dac925d01063ad3 Apr 17 18:56:20.361882 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:20.361789 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" event={"ID":"7ffac97d-c664-4d6c-a7ed-83dd7121891a","Type":"ContainerStarted","Data":"5b820a6971a1d34d88df08163403f087418afb9d4f0c41e5538ed2418c1c761e"} Apr 17 18:56:20.362941 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:20.362916 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" event={"ID":"78a35df5-9008-4e23-af9a-61bac606b3ac","Type":"ContainerStarted","Data":"5c77e2d94e94aa7f3bba387bac47afc1be11bab8391464880dac925d01063ad3"} Apr 17 18:56:24.381580 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:24.381541 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" event={"ID":"7ffac97d-c664-4d6c-a7ed-83dd7121891a","Type":"ContainerStarted","Data":"7a3a8767cf43e621e94fb5cc0f376807ba81a04c0c3a18908c4e7928fd3ff34f"} Apr 17 18:56:24.382048 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:24.381631 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:56:24.382841 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:24.382816 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" event={"ID":"78a35df5-9008-4e23-af9a-61bac606b3ac","Type":"ContainerStarted","Data":"940008ebfb1a638736a3e05f57d77efe85e50b476604600027f16f4fc866eb7b"} Apr 17 18:56:24.400803 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:24.400759 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" podStartSLOduration=1.133008397 podStartE2EDuration="5.400747873s" podCreationTimestamp="2026-04-17 18:56:19 +0000 UTC" firstStartedPulling="2026-04-17 18:56:19.635239224 +0000 UTC m=+435.287775130" lastFinishedPulling="2026-04-17 18:56:23.902978687 +0000 UTC m=+439.555514606" observedRunningTime="2026-04-17 18:56:24.398485762 +0000 UTC m=+440.051021688" watchObservedRunningTime="2026-04-17 18:56:24.400747873 +0000 UTC m=+440.053283798" Apr 17 18:56:24.413648 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:24.413604 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-59447f86f4-x4f2p" podStartSLOduration=1.570275621 podStartE2EDuration="5.413593026s" podCreationTimestamp="2026-04-17 18:56:19 +0000 UTC" firstStartedPulling="2026-04-17 18:56:20.10738615 +0000 UTC m=+435.759922056" 
lastFinishedPulling="2026-04-17 18:56:23.950703554 +0000 UTC m=+439.603239461" observedRunningTime="2026-04-17 18:56:24.411649664 +0000 UTC m=+440.064185614" watchObservedRunningTime="2026-04-17 18:56:24.413593026 +0000 UTC m=+440.066128952" Apr 17 18:56:35.388150 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:56:35.388107 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-697b5bd5df-2vp4b" Apr 17 18:58:02.864755 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:02.864674 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5"] Apr 17 18:58:02.868371 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:02.868352 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" Apr 17 18:58:02.870526 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:02.870496 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 18:58:02.870526 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:02.870518 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 18:58:02.870652 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:02.870518 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 18:58:02.871465 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:02.871446 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 18:58:02.871750 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:02.871727 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-zjsrt\"" Apr 17 18:58:02.873843 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:02.873823 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5"] Apr 17 18:58:02.907956 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:02.907923 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3409cd12-0cc7-4339-b457-935a5401c7dc-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-k59g5\" (UID: \"3409cd12-0cc7-4339-b457-935a5401c7dc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" Apr 17 18:58:02.908128 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:02.907965 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3409cd12-0cc7-4339-b457-935a5401c7dc-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-k59g5\" (UID: \"3409cd12-0cc7-4339-b457-935a5401c7dc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" Apr 17 18:58:02.908128 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:02.908059 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqgb7\" (UniqueName: \"kubernetes.io/projected/3409cd12-0cc7-4339-b457-935a5401c7dc-kube-api-access-sqgb7\") pod \"kuadrant-console-plugin-6cb54b5c86-k59g5\" (UID: \"3409cd12-0cc7-4339-b457-935a5401c7dc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" Apr 17 18:58:03.009073 ip-10-0-136-27 
kubenswrapper[2580]: I0417 18:58:03.009035 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3409cd12-0cc7-4339-b457-935a5401c7dc-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-k59g5\" (UID: \"3409cd12-0cc7-4339-b457-935a5401c7dc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" Apr 17 18:58:03.009268 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:03.009115 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqgb7\" (UniqueName: \"kubernetes.io/projected/3409cd12-0cc7-4339-b457-935a5401c7dc-kube-api-access-sqgb7\") pod \"kuadrant-console-plugin-6cb54b5c86-k59g5\" (UID: \"3409cd12-0cc7-4339-b457-935a5401c7dc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" Apr 17 18:58:03.009268 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:03.009154 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3409cd12-0cc7-4339-b457-935a5401c7dc-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-k59g5\" (UID: \"3409cd12-0cc7-4339-b457-935a5401c7dc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" Apr 17 18:58:03.009268 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:58:03.009198 2580 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 17 18:58:03.009436 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:58:03.009283 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3409cd12-0cc7-4339-b457-935a5401c7dc-plugin-serving-cert podName:3409cd12-0cc7-4339-b457-935a5401c7dc nodeName:}" failed. No retries permitted until 2026-04-17 18:58:03.509262288 +0000 UTC m=+539.161798206 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/3409cd12-0cc7-4339-b457-935a5401c7dc-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-k59g5" (UID: "3409cd12-0cc7-4339-b457-935a5401c7dc") : secret "plugin-serving-cert" not found Apr 17 18:58:03.009689 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:03.009671 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3409cd12-0cc7-4339-b457-935a5401c7dc-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-k59g5\" (UID: \"3409cd12-0cc7-4339-b457-935a5401c7dc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" Apr 17 18:58:03.033206 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:03.033171 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqgb7\" (UniqueName: \"kubernetes.io/projected/3409cd12-0cc7-4339-b457-935a5401c7dc-kube-api-access-sqgb7\") pod \"kuadrant-console-plugin-6cb54b5c86-k59g5\" (UID: \"3409cd12-0cc7-4339-b457-935a5401c7dc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" Apr 17 18:58:03.512846 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:03.512808 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3409cd12-0cc7-4339-b457-935a5401c7dc-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-k59g5\" (UID: \"3409cd12-0cc7-4339-b457-935a5401c7dc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" Apr 17 18:58:03.515475 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:03.515453 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3409cd12-0cc7-4339-b457-935a5401c7dc-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-k59g5\" (UID: \"3409cd12-0cc7-4339-b457-935a5401c7dc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" Apr 17 18:58:03.800223 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:03.800127 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" Apr 17 18:58:03.924335 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:03.924310 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5"] Apr 17 18:58:03.926936 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:58:03.926879 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3409cd12_0cc7_4339_b457_935a5401c7dc.slice/crio-9f6a13c555538289ab5e4f0ea0a252b1e0cdf037e5755999f61a051d63908ef2 WatchSource:0}: Error finding container 9f6a13c555538289ab5e4f0ea0a252b1e0cdf037e5755999f61a051d63908ef2: Status 404 returned error can't find the container with id 9f6a13c555538289ab5e4f0ea0a252b1e0cdf037e5755999f61a051d63908ef2 Apr 17 18:58:04.721600 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:04.721554 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" event={"ID":"3409cd12-0cc7-4339-b457-935a5401c7dc","Type":"ContainerStarted","Data":"9f6a13c555538289ab5e4f0ea0a252b1e0cdf037e5755999f61a051d63908ef2"} Apr 17 18:58:27.812835 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:27.812796 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" event={"ID":"3409cd12-0cc7-4339-b457-935a5401c7dc","Type":"ContainerStarted","Data":"21829ea6a3f53c8a5c5760ceed24a993f3eb2e77192c09c32fffb163ea97b144"} Apr 17 18:58:27.832146 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:27.832098 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-k59g5" podStartSLOduration=2.30904707 podStartE2EDuration="25.832084157s" podCreationTimestamp="2026-04-17 18:58:02 +0000 UTC" firstStartedPulling="2026-04-17 18:58:03.928268727 +0000 UTC m=+539.580804631" lastFinishedPulling="2026-04-17 18:58:27.451305808 +0000 UTC m=+563.103841718" observedRunningTime="2026-04-17 18:58:27.829961461 +0000 UTC m=+563.482497386" watchObservedRunningTime="2026-04-17 18:58:27.832084157 +0000 UTC m=+563.484620083" Apr 17 18:58:51.645764 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:51.645726 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 18:58:51.657867 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:51.657838 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" Apr 17 18:58:51.658047 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:51.658022 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 18:58:51.660028 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:51.660006 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 18:58:51.668888 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:51.668865 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 18:58:51.760748 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:51.760705 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjnfp\" (UniqueName: \"kubernetes.io/projected/ff41be55-78ab-404e-b574-38e9ead63668-kube-api-access-cjnfp\") pod \"limitador-limitador-78c99df468-r7cmf\" (UID: \"ff41be55-78ab-404e-b574-38e9ead63668\") " pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" Apr 17 18:58:51.760748 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:51.760753 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff41be55-78ab-404e-b574-38e9ead63668-config-file\") pod \"limitador-limitador-78c99df468-r7cmf\" (UID: \"ff41be55-78ab-404e-b574-38e9ead63668\") " pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" Apr 17 18:58:51.861819 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:51.861783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjnfp\" (UniqueName: \"kubernetes.io/projected/ff41be55-78ab-404e-b574-38e9ead63668-kube-api-access-cjnfp\") pod \"limitador-limitador-78c99df468-r7cmf\" (UID: \"ff41be55-78ab-404e-b574-38e9ead63668\") " pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" Apr 17 18:58:51.861819 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:51.861824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff41be55-78ab-404e-b574-38e9ead63668-config-file\") pod \"limitador-limitador-78c99df468-r7cmf\" (UID: \"ff41be55-78ab-404e-b574-38e9ead63668\") " pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" Apr 17 18:58:51.862409 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:51.862390 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff41be55-78ab-404e-b574-38e9ead63668-config-file\") pod \"limitador-limitador-78c99df468-r7cmf\" (UID: \"ff41be55-78ab-404e-b574-38e9ead63668\") " pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" Apr 17 18:58:51.870015 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:51.869986 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjnfp\" (UniqueName: \"kubernetes.io/projected/ff41be55-78ab-404e-b574-38e9ead63668-kube-api-access-cjnfp\") pod \"limitador-limitador-78c99df468-r7cmf\" (UID: \"ff41be55-78ab-404e-b574-38e9ead63668\") " pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" Apr 17 18:58:51.969917 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:51.969822 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" Apr 17 18:58:52.099261 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:52.099238 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 18:58:52.101845 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:58:52.101819 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff41be55_78ab_404e_b574_38e9ead63668.slice/crio-66e18f83d525fb52d1ca8d67ed7e1a03e4641cca655e48d6116c3620a73168b6 WatchSource:0}: Error finding container 66e18f83d525fb52d1ca8d67ed7e1a03e4641cca655e48d6116c3620a73168b6: Status 404 returned error can't find the container with id 66e18f83d525fb52d1ca8d67ed7e1a03e4641cca655e48d6116c3620a73168b6 Apr 17 18:58:52.899366 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:52.899325 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" event={"ID":"ff41be55-78ab-404e-b574-38e9ead63668","Type":"ContainerStarted","Data":"66e18f83d525fb52d1ca8d67ed7e1a03e4641cca655e48d6116c3620a73168b6"} Apr 17 18:58:54.907875 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:54.907839 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" event={"ID":"ff41be55-78ab-404e-b574-38e9ead63668","Type":"ContainerStarted","Data":"726c8e857b63dba471b90377ae5c3e6df8842900c0275530efdb68374dc95610"} Apr 17 18:58:54.908283 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:54.908007 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" Apr 17 18:58:54.924061 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:58:54.924014 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" podStartSLOduration=1.3430817290000001 podStartE2EDuration="3.92400023s" podCreationTimestamp="2026-04-17 18:58:51 +0000 UTC" firstStartedPulling="2026-04-17 18:58:52.103583854 +0000 UTC m=+587.756119758" lastFinishedPulling="2026-04-17 18:58:54.684502351 +0000 UTC m=+590.337038259" observedRunningTime="2026-04-17 18:58:54.921705118 +0000 UTC m=+590.574241134" watchObservedRunningTime="2026-04-17 18:58:54.92400023 +0000 UTC m=+590.576536198" Apr 17 18:59:04.903251 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:04.903217 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 18:59:04.903779 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:04.903350 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 18:59:04.909631 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:04.909604 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 18:59:04.909744 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:04.909611 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 18:59:05.912039 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:05.912010 2580 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-r7cmf" Apr 17 18:59:20.937960 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:20.937864 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9xbcs"] Apr 17 18:59:20.941199 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:20.941183 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9xbcs" Apr 17 18:59:20.943185 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:20.943166 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-mr96b\"" Apr 17 18:59:20.946378 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:20.946351 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9xbcs"] Apr 17 18:59:21.016693 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.016653 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvp95\" (UniqueName: \"kubernetes.io/projected/60e71af1-9667-40d4-a0c8-c9fa55f530d1-kube-api-access-gvp95\") pod \"authorino-8b475cf9f-9xbcs\" (UID: \"60e71af1-9667-40d4-a0c8-c9fa55f530d1\") " pod="kuadrant-system/authorino-8b475cf9f-9xbcs" Apr 17 18:59:21.117490 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.117456 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvp95\" (UniqueName: \"kubernetes.io/projected/60e71af1-9667-40d4-a0c8-c9fa55f530d1-kube-api-access-gvp95\") pod \"authorino-8b475cf9f-9xbcs\" (UID: \"60e71af1-9667-40d4-a0c8-c9fa55f530d1\") " pod="kuadrant-system/authorino-8b475cf9f-9xbcs" Apr 17 18:59:21.125383 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.125361 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvp95\" (UniqueName: \"kubernetes.io/projected/60e71af1-9667-40d4-a0c8-c9fa55f530d1-kube-api-access-gvp95\") pod \"authorino-8b475cf9f-9xbcs\" (UID: \"60e71af1-9667-40d4-a0c8-c9fa55f530d1\") " pod="kuadrant-system/authorino-8b475cf9f-9xbcs" Apr 17 18:59:21.180559 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.180520 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9xbcs"] Apr 17 18:59:21.180781 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.180770 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9xbcs" Apr 17 18:59:21.209592 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.209560 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6dfbc9c798-w8wfd"] Apr 17 18:59:21.214159 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.214134 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" Apr 17 18:59:21.218655 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.218629 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6dfbc9c798-w8wfd"] Apr 17 18:59:21.305926 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.305874 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9xbcs"] Apr 17 18:59:21.308481 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:59:21.308448 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60e71af1_9667_40d4_a0c8_c9fa55f530d1.slice/crio-5a6a2d5d764f99ee8e52cd61fb6fa24d0b2d638b4113ae0e4b09529b51563e08 WatchSource:0}: Error finding container 5a6a2d5d764f99ee8e52cd61fb6fa24d0b2d638b4113ae0e4b09529b51563e08: Status 404 returned error can't find the container with id 5a6a2d5d764f99ee8e52cd61fb6fa24d0b2d638b4113ae0e4b09529b51563e08 Apr 17 18:59:21.319241 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.319212 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxc25\" (UniqueName: \"kubernetes.io/projected/5fee6a53-0e9e-472f-86f7-90da8c4e475d-kube-api-access-vxc25\") pod \"authorino-6dfbc9c798-w8wfd\" (UID: \"5fee6a53-0e9e-472f-86f7-90da8c4e475d\") " pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" Apr 17 18:59:21.322134 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.322111 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 18:59:21.419886 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.419842 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxc25\" (UniqueName: \"kubernetes.io/projected/5fee6a53-0e9e-472f-86f7-90da8c4e475d-kube-api-access-vxc25\") pod \"authorino-6dfbc9c798-w8wfd\" (UID: \"5fee6a53-0e9e-472f-86f7-90da8c4e475d\") " pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" Apr 17 18:59:21.427456 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.427432 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxc25\" (UniqueName: \"kubernetes.io/projected/5fee6a53-0e9e-472f-86f7-90da8c4e475d-kube-api-access-vxc25\") pod \"authorino-6dfbc9c798-w8wfd\" (UID: \"5fee6a53-0e9e-472f-86f7-90da8c4e475d\") " pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" Apr 17 18:59:21.525575 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.525529 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" Apr 17 18:59:21.643355 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.643322 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6dfbc9c798-w8wfd"] Apr 17 18:59:21.646326 ip-10-0-136-27 kubenswrapper[2580]: W0417 18:59:21.646299 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fee6a53_0e9e_472f_86f7_90da8c4e475d.slice/crio-6512d89e7bb65cd4c62c40285e866d8a4426363a8f476452e98912d29856aa3d WatchSource:0}: Error finding container 6512d89e7bb65cd4c62c40285e866d8a4426363a8f476452e98912d29856aa3d: Status 404 returned error can't find the container with id 6512d89e7bb65cd4c62c40285e866d8a4426363a8f476452e98912d29856aa3d Apr 17 18:59:21.995588 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.995547 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" event={"ID":"5fee6a53-0e9e-472f-86f7-90da8c4e475d","Type":"ContainerStarted","Data":"6512d89e7bb65cd4c62c40285e866d8a4426363a8f476452e98912d29856aa3d"} Apr 17 18:59:21.997109 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:21.997073 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-9xbcs" event={"ID":"60e71af1-9667-40d4-a0c8-c9fa55f530d1","Type":"ContainerStarted","Data":"5a6a2d5d764f99ee8e52cd61fb6fa24d0b2d638b4113ae0e4b09529b51563e08"} Apr 17 18:59:25.014429 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:25.014388 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" event={"ID":"5fee6a53-0e9e-472f-86f7-90da8c4e475d","Type":"ContainerStarted","Data":"6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327"} Apr 17 18:59:25.016505 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:25.016468 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-9xbcs" event={"ID":"60e71af1-9667-40d4-a0c8-c9fa55f530d1","Type":"ContainerStarted","Data":"ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3"} Apr 17 18:59:25.016717 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:25.016481 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-9xbcs" podUID="60e71af1-9667-40d4-a0c8-c9fa55f530d1" containerName="authorino" containerID="cri-o://ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3" gracePeriod=30 Apr 17 18:59:25.028465 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:25.028420 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" podStartSLOduration=0.82163374 podStartE2EDuration="4.028405757s" podCreationTimestamp="2026-04-17 18:59:21 +0000 UTC" firstStartedPulling="2026-04-17 18:59:21.647605214 +0000 UTC m=+617.300141119" lastFinishedPulling="2026-04-17 18:59:24.854377224 +0000 UTC m=+620.506913136" observedRunningTime="2026-04-17 18:59:25.026764919 +0000 UTC m=+620.679300845" watchObservedRunningTime="2026-04-17 18:59:25.028405757 +0000 UTC m=+620.680941738" Apr 17 18:59:25.042481 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:25.041459 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-9xbcs" podStartSLOduration=1.5074297410000002 podStartE2EDuration="5.041440692s" podCreationTimestamp="2026-04-17 18:59:20 +0000 UTC" firstStartedPulling="2026-04-17 
18:59:21.309802383 +0000 UTC m=+616.962338288" lastFinishedPulling="2026-04-17 18:59:24.843813332 +0000 UTC m=+620.496349239" observedRunningTime="2026-04-17 18:59:25.039196945 +0000 UTC m=+620.691732872" watchObservedRunningTime="2026-04-17 18:59:25.041440692 +0000 UTC m=+620.693976619" Apr 17 18:59:25.263293 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:25.263270 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9xbcs" Apr 17 18:59:25.366883 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:25.366855 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvp95\" (UniqueName: \"kubernetes.io/projected/60e71af1-9667-40d4-a0c8-c9fa55f530d1-kube-api-access-gvp95\") pod \"60e71af1-9667-40d4-a0c8-c9fa55f530d1\" (UID: \"60e71af1-9667-40d4-a0c8-c9fa55f530d1\") " Apr 17 18:59:25.369141 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:25.369117 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e71af1-9667-40d4-a0c8-c9fa55f530d1-kube-api-access-gvp95" (OuterVolumeSpecName: "kube-api-access-gvp95") pod "60e71af1-9667-40d4-a0c8-c9fa55f530d1" (UID: "60e71af1-9667-40d4-a0c8-c9fa55f530d1"). InnerVolumeSpecName "kube-api-access-gvp95". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:59:25.467799 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:25.467724 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gvp95\" (UniqueName: \"kubernetes.io/projected/60e71af1-9667-40d4-a0c8-c9fa55f530d1-kube-api-access-gvp95\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:59:26.020915 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:26.020855 2580 generic.go:358] "Generic (PLEG): container finished" podID="60e71af1-9667-40d4-a0c8-c9fa55f530d1" containerID="ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3" exitCode=0 Apr 17 18:59:26.020915 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:26.020917 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9xbcs" Apr 17 18:59:26.021401 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:26.020938 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-9xbcs" event={"ID":"60e71af1-9667-40d4-a0c8-c9fa55f530d1","Type":"ContainerDied","Data":"ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3"} Apr 17 18:59:26.021401 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:26.020974 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-9xbcs" event={"ID":"60e71af1-9667-40d4-a0c8-c9fa55f530d1","Type":"ContainerDied","Data":"5a6a2d5d764f99ee8e52cd61fb6fa24d0b2d638b4113ae0e4b09529b51563e08"} Apr 17 18:59:26.021401 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:26.020989 2580 scope.go:117] "RemoveContainer" containerID="ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3" Apr 17 18:59:26.032802 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:26.032784 2580 scope.go:117] "RemoveContainer" containerID="ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3" Apr 17 18:59:26.033133 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:59:26.033111 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3\": container with ID starting with ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3 not found: ID does not exist" containerID="ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3" Apr 17 18:59:26.033202 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:26.033142 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3"} err="failed to get container status \"ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3\": rpc error: code = NotFound desc = could not find container \"ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3\": container with ID starting with ee0eed9264b124830c9b87fbb04284733fb496fada33c22ae546aee4b6595cb3 not found: ID does not exist" Apr 17 18:59:26.044413 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:26.044385 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9xbcs"] Apr 17 18:59:26.046115 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:26.046089 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9xbcs"] Apr 17 18:59:26.979191 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:26.979152 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60e71af1-9667-40d4-a0c8-c9fa55f530d1" path="/var/lib/kubelet/pods/60e71af1-9667-40d4-a0c8-c9fa55f530d1/volumes" Apr 17 18:59:27.450815 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:27.450739 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6dfbc9c798-w8wfd"] Apr 17 18:59:27.451255 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:27.450971 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" podUID="5fee6a53-0e9e-472f-86f7-90da8c4e475d" containerName="authorino" containerID="cri-o://6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327" gracePeriod=30 Apr 17 18:59:27.693879 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:27.693852 2580 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" Apr 17 18:59:27.786175 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:27.786144 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxc25\" (UniqueName: \"kubernetes.io/projected/5fee6a53-0e9e-472f-86f7-90da8c4e475d-kube-api-access-vxc25\") pod \"5fee6a53-0e9e-472f-86f7-90da8c4e475d\" (UID: \"5fee6a53-0e9e-472f-86f7-90da8c4e475d\") " Apr 17 18:59:27.788363 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:27.788334 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fee6a53-0e9e-472f-86f7-90da8c4e475d-kube-api-access-vxc25" (OuterVolumeSpecName: "kube-api-access-vxc25") pod "5fee6a53-0e9e-472f-86f7-90da8c4e475d" (UID: "5fee6a53-0e9e-472f-86f7-90da8c4e475d"). InnerVolumeSpecName "kube-api-access-vxc25". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:59:27.887626 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:27.887587 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vxc25\" (UniqueName: \"kubernetes.io/projected/5fee6a53-0e9e-472f-86f7-90da8c4e475d-kube-api-access-vxc25\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 18:59:28.030853 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:28.030815 2580 generic.go:358] "Generic (PLEG): container finished" podID="5fee6a53-0e9e-472f-86f7-90da8c4e475d" containerID="6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327" exitCode=0 Apr 17 18:59:28.031044 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:28.030865 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" Apr 17 18:59:28.031044 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:28.030868 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" event={"ID":"5fee6a53-0e9e-472f-86f7-90da8c4e475d","Type":"ContainerDied","Data":"6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327"} Apr 17 18:59:28.031044 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:28.030928 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6dfbc9c798-w8wfd" event={"ID":"5fee6a53-0e9e-472f-86f7-90da8c4e475d","Type":"ContainerDied","Data":"6512d89e7bb65cd4c62c40285e866d8a4426363a8f476452e98912d29856aa3d"} Apr 17 18:59:28.031044 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:28.030948 2580 scope.go:117] "RemoveContainer" containerID="6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327" Apr 17 18:59:28.039872 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:28.039851 2580 scope.go:117] "RemoveContainer" containerID="6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327" Apr 17 18:59:28.040191 ip-10-0-136-27 kubenswrapper[2580]: E0417 18:59:28.040172 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327\": container with ID starting with 6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327 not found: ID does not exist" containerID="6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327" Apr 17 18:59:28.040258 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:28.040201 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327"} err="failed to get container status \"6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327\": rpc error: code = NotFound desc = could not find container \"6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327\": container with ID starting with 6ff3a6289709e167585f183e6167d0e974afa7c92753e58febef6cdbb258b327 not found: ID does not exist" Apr 17 18:59:28.053693 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:28.053661 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6dfbc9c798-w8wfd"] Apr 17 18:59:28.059614 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:28.059588 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-6dfbc9c798-w8wfd"] Apr 17 18:59:28.979122 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:28.979088 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fee6a53-0e9e-472f-86f7-90da8c4e475d" path="/var/lib/kubelet/pods/5fee6a53-0e9e-472f-86f7-90da8c4e475d/volumes" Apr 17 18:59:59.253450 ip-10-0-136-27 kubenswrapper[2580]: I0417 18:59:59.253412 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:00:10.626406 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:00:10.626369 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:00:24.233269 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:00:24.233238 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:00:31.933386 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:00:31.933349 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:00:37.430148 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:00:37.430109 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:00:46.136135 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:00:46.136097 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:01:15.032513 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.032436 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6c9d9f76ff-rmbhf"] Apr 17 19:01:15.032990 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.032768 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5fee6a53-0e9e-472f-86f7-90da8c4e475d" containerName="authorino" Apr 17 19:01:15.032990 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.032779 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fee6a53-0e9e-472f-86f7-90da8c4e475d" containerName="authorino" Apr 17 19:01:15.032990 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.032790 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60e71af1-9667-40d4-a0c8-c9fa55f530d1" containerName="authorino" Apr 17 19:01:15.032990 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.032795 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e71af1-9667-40d4-a0c8-c9fa55f530d1" containerName="authorino" Apr 17 19:01:15.032990 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.032855 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="5fee6a53-0e9e-472f-86f7-90da8c4e475d" containerName="authorino" Apr 17 
19:01:15.032990 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.032865 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="60e71af1-9667-40d4-a0c8-c9fa55f530d1" containerName="authorino" Apr 17 19:01:15.036028 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.036012 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6c9d9f76ff-rmbhf" Apr 17 19:01:15.038916 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.038879 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 19:01:15.038916 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.038883 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-mr96b\"" Apr 17 19:01:15.041597 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.041575 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6c9d9f76ff-rmbhf"] Apr 17 19:01:15.088443 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.088414 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjhbh\" (UniqueName: \"kubernetes.io/projected/43131641-b000-4428-86a0-ddd5400ee917-kube-api-access-hjhbh\") pod \"authorino-6c9d9f76ff-rmbhf\" (UID: \"43131641-b000-4428-86a0-ddd5400ee917\") " pod="kuadrant-system/authorino-6c9d9f76ff-rmbhf" Apr 17 19:01:15.088611 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.088549 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/43131641-b000-4428-86a0-ddd5400ee917-tls-cert\") pod \"authorino-6c9d9f76ff-rmbhf\" (UID: \"43131641-b000-4428-86a0-ddd5400ee917\") " pod="kuadrant-system/authorino-6c9d9f76ff-rmbhf" Apr 17 19:01:15.189393 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.189358 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjhbh\" (UniqueName: \"kubernetes.io/projected/43131641-b000-4428-86a0-ddd5400ee917-kube-api-access-hjhbh\") pod \"authorino-6c9d9f76ff-rmbhf\" (UID: \"43131641-b000-4428-86a0-ddd5400ee917\") " pod="kuadrant-system/authorino-6c9d9f76ff-rmbhf" Apr 17 19:01:15.189556 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.189402 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/43131641-b000-4428-86a0-ddd5400ee917-tls-cert\") pod \"authorino-6c9d9f76ff-rmbhf\" (UID: \"43131641-b000-4428-86a0-ddd5400ee917\") " pod="kuadrant-system/authorino-6c9d9f76ff-rmbhf" Apr 17 19:01:15.192055 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.192032 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/43131641-b000-4428-86a0-ddd5400ee917-tls-cert\") pod \"authorino-6c9d9f76ff-rmbhf\" (UID: \"43131641-b000-4428-86a0-ddd5400ee917\") " pod="kuadrant-system/authorino-6c9d9f76ff-rmbhf" Apr 17 19:01:15.196874 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.196841 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjhbh\" (UniqueName: \"kubernetes.io/projected/43131641-b000-4428-86a0-ddd5400ee917-kube-api-access-hjhbh\") pod \"authorino-6c9d9f76ff-rmbhf\" (UID: \"43131641-b000-4428-86a0-ddd5400ee917\") " pod="kuadrant-system/authorino-6c9d9f76ff-rmbhf" Apr 17 19:01:15.346442 ip-10-0-136-27 
kubenswrapper[2580]: I0417 19:01:15.346354 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6c9d9f76ff-rmbhf" Apr 17 19:01:15.485110 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.485084 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6c9d9f76ff-rmbhf"] Apr 17 19:01:15.487716 ip-10-0-136-27 kubenswrapper[2580]: W0417 19:01:15.487675 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43131641_b000_4428_86a0_ddd5400ee917.slice/crio-717df9675c25107ab2d60713f40781946baef06776e354ddf3d0bbc3b135413e WatchSource:0}: Error finding container 717df9675c25107ab2d60713f40781946baef06776e354ddf3d0bbc3b135413e: Status 404 returned error can't find the container with id 717df9675c25107ab2d60713f40781946baef06776e354ddf3d0bbc3b135413e Apr 17 19:01:15.489001 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:15.488982 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 19:01:16.394036 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:16.393991 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6c9d9f76ff-rmbhf" event={"ID":"43131641-b000-4428-86a0-ddd5400ee917","Type":"ContainerStarted","Data":"fb94c037d350201279c8247295875a31d857a3639bb6c0ef701fdbf42d236143"} Apr 17 19:01:16.394036 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:16.394041 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6c9d9f76ff-rmbhf" event={"ID":"43131641-b000-4428-86a0-ddd5400ee917","Type":"ContainerStarted","Data":"717df9675c25107ab2d60713f40781946baef06776e354ddf3d0bbc3b135413e"} Apr 17 19:01:16.408035 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:16.407978 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-6c9d9f76ff-rmbhf" podStartSLOduration=0.983538679 podStartE2EDuration="1.407958393s" podCreationTimestamp="2026-04-17 19:01:15 +0000 UTC" firstStartedPulling="2026-04-17 19:01:15.489105329 +0000 UTC m=+731.141641233" lastFinishedPulling="2026-04-17 19:01:15.91352504 +0000 UTC m=+731.566060947" observedRunningTime="2026-04-17 19:01:16.406767339 +0000 UTC m=+732.059303265" watchObservedRunningTime="2026-04-17 19:01:16.407958393 +0000 UTC m=+732.060494311" Apr 17 19:01:41.934704 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:41.934674 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:01:52.933566 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:01:52.933527 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:02:01.238825 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:02:01.238779 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:02:11.725423 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:02:11.725392 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:02:20.530468 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:02:20.530386 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:02:32.023230 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:02:32.023182 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:03:33.333569 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:03:33.333534 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:03:49.025753 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:03:49.025670 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:04:04.927880 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:04:04.927854 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 19:04:04.928785 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:04:04.928765 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 19:04:04.934330 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:04:04.934305 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 19:04:04.935320 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:04:04.935297 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 19:04:27.125067 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:04:27.125029 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:04:44.027878 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:04:44.027838 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:04:58.727158 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:04:58.727124 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:05:14.825233 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:05:14.825195 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:05:44.839110 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:05:44.839070 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:05:50.419325 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:05:50.419286 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:06:10.923497 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:06:10.923465 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:06:19.620015 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:06:19.619982 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:06:36.820814 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:06:36.820779 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:06:45.128517 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:06:45.128484 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:07:01.723879 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:07:01.723797 2580 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:07:11.129663 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:07:11.129626 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:07:42.821684 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:07:42.821639 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:07:51.524794 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:07:51.524745 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:07:59.822580 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:07:59.822541 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:08:08.321937 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:08:08.321882 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:08:16.525313 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:08:16.525273 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:08:34.121950 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:08:34.121913 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:08:47.324709 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:08:47.324672 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:09:04.953836 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:09:04.953805 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 19:09:04.955600 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:09:04.955576 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 19:09:04.960755 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:09:04.960735 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 19:09:04.961881 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:09:04.961863 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 19:09:34.426677 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:09:34.426638 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:09:42.028397 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:09:42.028359 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:09:49.924835 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:09:49.924749 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:09:58.828048 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:09:58.828008 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:10:07.732073 ip-10-0-136-27 kubenswrapper[2580]: I0417 
19:10:07.732033 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:10:16.125533 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:10:16.125497 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:10:25.426014 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:10:25.422672 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:10:30.823092 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:10:30.823053 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:10:34.725461 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:10:34.725421 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:10:42.529187 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:10:42.529151 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:10:52.127083 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:10:52.127049 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:11:01.231485 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:11:01.231447 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:11:09.231454 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:11:09.231409 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:11:18.726767 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:11:18.726681 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:11:26.929739 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:11:26.929704 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:11:35.023567 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:11:35.023530 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:11:43.824181 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:11:43.824147 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:11:53.420751 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:11:53.420712 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:12:01.432097 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:12:01.432060 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:14:04.986823 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:14:04.986796 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 19:14:04.991730 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:14:04.991700 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 19:14:04.996262 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:14:04.996234 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 19:14:04.998240 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:14:04.998219 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 19:14:20.147816 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:14:20.147730 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:14:26.327159 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:14:26.327122 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:14:49.128308 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:14:49.128265 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:14:53.829771 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:14:53.829736 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:15:00.129205 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:00.129168 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607555-hmffw"] Apr 17 19:15:00.132600 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:00.132584 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" Apr 17 19:15:00.134749 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:00.134730 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-8nwjw\"" Apr 17 19:15:00.138620 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:00.138595 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607555-hmffw"] Apr 17 19:15:00.266696 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:00.266661 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5hk2\" (UniqueName: \"kubernetes.io/projected/d1842043-678b-45c3-9ddd-b63c0a47de52-kube-api-access-j5hk2\") pod \"maas-api-key-cleanup-29607555-hmffw\" (UID: \"d1842043-678b-45c3-9ddd-b63c0a47de52\") " pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" Apr 17 19:15:00.367931 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:00.367870 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5hk2\" (UniqueName: \"kubernetes.io/projected/d1842043-678b-45c3-9ddd-b63c0a47de52-kube-api-access-j5hk2\") pod \"maas-api-key-cleanup-29607555-hmffw\" (UID: \"d1842043-678b-45c3-9ddd-b63c0a47de52\") " pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" Apr 17 19:15:00.376343 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:00.376294 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5hk2\" (UniqueName: \"kubernetes.io/projected/d1842043-678b-45c3-9ddd-b63c0a47de52-kube-api-access-j5hk2\") pod \"maas-api-key-cleanup-29607555-hmffw\" (UID: \"d1842043-678b-45c3-9ddd-b63c0a47de52\") " pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" Apr 17 19:15:00.443932 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:00.443833 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" Apr 17 19:15:00.772137 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:00.772104 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607555-hmffw"] Apr 17 19:15:00.774521 ip-10-0-136-27 kubenswrapper[2580]: W0417 19:15:00.774485 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1842043_678b_45c3_9ddd_b63c0a47de52.slice/crio-b9afe6570e2911c6bc1687745d0a220bf69864b4d13287e8cd04b7da840880e1 WatchSource:0}: Error finding container b9afe6570e2911c6bc1687745d0a220bf69864b4d13287e8cd04b7da840880e1: Status 404 returned error can't find the container with id b9afe6570e2911c6bc1687745d0a220bf69864b4d13287e8cd04b7da840880e1 Apr 17 19:15:00.776229 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:00.776212 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 19:15:01.204004 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:01.203887 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" event={"ID":"d1842043-678b-45c3-9ddd-b63c0a47de52","Type":"ContainerStarted","Data":"b9afe6570e2911c6bc1687745d0a220bf69864b4d13287e8cd04b7da840880e1"} Apr 17 19:15:03.212483 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:03.212439 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" event={"ID":"d1842043-678b-45c3-9ddd-b63c0a47de52","Type":"ContainerStarted","Data":"005fd295f302c0b592cd945fedd13a70fa1f977a683fdbe4510267d802b4475a"} Apr 17 19:15:03.229020 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:03.228969 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" podStartSLOduration=1.26758768 podStartE2EDuration="3.228956326s" podCreationTimestamp="2026-04-17 19:15:00 +0000 UTC" firstStartedPulling="2026-04-17 19:15:00.776375479 +0000 UTC m=+1556.428911384" lastFinishedPulling="2026-04-17 19:15:02.737744124 +0000 UTC m=+1558.390280030" observedRunningTime="2026-04-17 19:15:03.227308185 +0000 UTC m=+1558.879844112" watchObservedRunningTime="2026-04-17 19:15:03.228956326 +0000 UTC m=+1558.881492252" Apr 17 19:15:03.625658 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:03.625615 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:15:14.326383 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:14.326343 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:15:22.334579 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:22.334527 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:15:24.289507 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:24.289469 2580 generic.go:358] "Generic (PLEG): container finished" podID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerID="005fd295f302c0b592cd945fedd13a70fa1f977a683fdbe4510267d802b4475a" exitCode=6 Apr 17 19:15:24.289924 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:24.289547 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" event={"ID":"d1842043-678b-45c3-9ddd-b63c0a47de52","Type":"ContainerDied","Data":"005fd295f302c0b592cd945fedd13a70fa1f977a683fdbe4510267d802b4475a"} Apr 17 
19:15:24.289924 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:24.289870 2580 scope.go:117] "RemoveContainer" containerID="005fd295f302c0b592cd945fedd13a70fa1f977a683fdbe4510267d802b4475a" Apr 17 19:15:25.294962 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:25.294927 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" event={"ID":"d1842043-678b-45c3-9ddd-b63c0a47de52","Type":"ContainerStarted","Data":"3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738"} Apr 17 19:15:33.121475 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:33.121433 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:15:42.922462 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:42.922426 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:15:45.367731 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:45.367695 2580 generic.go:358] "Generic (PLEG): container finished" podID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerID="3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738" exitCode=6 Apr 17 19:15:45.368163 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:45.367765 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" event={"ID":"d1842043-678b-45c3-9ddd-b63c0a47de52","Type":"ContainerDied","Data":"3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738"} Apr 17 19:15:45.368163 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:45.367808 2580 scope.go:117] "RemoveContainer" containerID="005fd295f302c0b592cd945fedd13a70fa1f977a683fdbe4510267d802b4475a" Apr 17 19:15:45.368163 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:45.368152 2580 scope.go:117] "RemoveContainer" containerID="3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738" Apr 17 19:15:45.368389 ip-10-0-136-27 kubenswrapper[2580]: E0417 19:15:45.368367 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607555-hmffw_opendatahub(d1842043-678b-45c3-9ddd-b63c0a47de52)\"" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" podUID="d1842043-678b-45c3-9ddd-b63c0a47de52" Apr 17 19:15:53.033877 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:53.033787 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:15:59.973730 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:15:59.973693 2580 scope.go:117] "RemoveContainer" containerID="3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738" Apr 17 19:16:00.008761 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:00.008727 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607555-hmffw"] Apr 17 19:16:00.421718 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:00.421676 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" event={"ID":"d1842043-678b-45c3-9ddd-b63c0a47de52","Type":"ContainerStarted","Data":"8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278"} Apr 17 19:16:00.421916 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:00.421725 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" 
podUID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerName="cleanup" containerID="cri-o://8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278" gracePeriod=30 Apr 17 19:16:01.527138 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:01.527100 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:16:12.482164 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:12.482115 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:16:20.770668 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:20.770639 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" Apr 17 19:16:20.874969 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:20.874859 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5hk2\" (UniqueName: \"kubernetes.io/projected/d1842043-678b-45c3-9ddd-b63c0a47de52-kube-api-access-j5hk2\") pod \"d1842043-678b-45c3-9ddd-b63c0a47de52\" (UID: \"d1842043-678b-45c3-9ddd-b63c0a47de52\") " Apr 17 19:16:20.877224 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:20.877188 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1842043-678b-45c3-9ddd-b63c0a47de52-kube-api-access-j5hk2" (OuterVolumeSpecName: "kube-api-access-j5hk2") pod "d1842043-678b-45c3-9ddd-b63c0a47de52" (UID: "d1842043-678b-45c3-9ddd-b63c0a47de52"). InnerVolumeSpecName "kube-api-access-j5hk2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 19:16:20.975903 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:20.975871 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j5hk2\" (UniqueName: \"kubernetes.io/projected/d1842043-678b-45c3-9ddd-b63c0a47de52-kube-api-access-j5hk2\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 17 19:16:21.499094 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.499058 2580 generic.go:358] "Generic (PLEG): container finished" podID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerID="8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278" exitCode=6 Apr 17 19:16:21.499260 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.499125 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" Apr 17 19:16:21.499260 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.499135 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" event={"ID":"d1842043-678b-45c3-9ddd-b63c0a47de52","Type":"ContainerDied","Data":"8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278"} Apr 17 19:16:21.499260 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.499164 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607555-hmffw" event={"ID":"d1842043-678b-45c3-9ddd-b63c0a47de52","Type":"ContainerDied","Data":"b9afe6570e2911c6bc1687745d0a220bf69864b4d13287e8cd04b7da840880e1"} Apr 17 19:16:21.499260 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.499179 2580 scope.go:117] "RemoveContainer" containerID="8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278" Apr 17 19:16:21.507660 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.507641 2580 scope.go:117] "RemoveContainer" containerID="3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738" Apr 17 19:16:21.515113 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.515084 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607555-hmffw"] Apr 17 19:16:21.516639 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.516615 2580 scope.go:117] "RemoveContainer" containerID="8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278" Apr 17 19:16:21.516983 ip-10-0-136-27 kubenswrapper[2580]: E0417 19:16:21.516961 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278\": container with ID starting with 8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278 not found: ID does not exist" containerID="8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278" Apr 17 19:16:21.517053 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.517001 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278"} err="failed to get container status \"8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278\": rpc error: code = NotFound desc = could not find container \"8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278\": container with ID starting with 8c09998d27ce2c832e76f66ba1e57bed2c06a6eabc9cd280253f7556d1d00278 not found: ID does not exist" Apr 17 19:16:21.517053 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.517020 2580 scope.go:117] "RemoveContainer" containerID="3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738" Apr 17 19:16:21.517339 ip-10-0-136-27 kubenswrapper[2580]: E0417 19:16:21.517318 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738\": container with ID starting with 3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738 not found: ID does not exist" containerID="3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738" Apr 17 19:16:21.517446 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.517342 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738"} err="failed to get container status \"3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738\": rpc error: code = NotFound desc = could not find container \"3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738\": container with ID starting with 3625805ea709ffda7d568be753848f496043fe76a85dc0a470bf152122370738 not found: ID does not exist" Apr 17 19:16:21.517809 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.517790 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607555-hmffw"] Apr 17 19:16:21.632999 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:21.632961 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:16:22.979307 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:22.979270 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1842043-678b-45c3-9ddd-b63c0a47de52" path="/var/lib/kubelet/pods/d1842043-678b-45c3-9ddd-b63c0a47de52/volumes" Apr 17 19:16:54.329303 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:16:54.329266 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:17:36.623216 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:17:36.623181 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:17:44.928509 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:17:44.928472 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:17:53.854870 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:17:53.854834 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:18:02.539659 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:18:02.539613 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:18:11.849292 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:18:11.849257 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:18:24.832254 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:18:24.832218 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:18:33.534319 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:18:33.531966 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:18:41.628446 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:18:41.628411 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:18:49.129477 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:18:49.129393 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:18:57.218644 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:18:57.218597 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:19:05.015388 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:19:05.015341 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 
19:19:05.018418 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:19:05.018397 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 19:19:05.021753 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:19:05.021732 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 19:19:05.024643 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:19:05.024626 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 19:19:06.426072 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:19:06.426042 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:19:16.530966 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:19:16.530931 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:19:34.235655 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:19:34.235602 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:19:42.822916 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:19:42.822869 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:19:51.629176 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:19:51.629132 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:19:59.723872 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:19:59.723831 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:20:17.130595 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:20:17.130507 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:20:24.924397 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:20:24.924358 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:20:34.623788 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:20:34.623750 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:20:42.240236 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:20:42.240200 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:20:51.326118 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:20:51.326080 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:21:00.929822 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:21:00.929783 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:21:10.428793 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:21:10.428752 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:21:20.223999 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:21:20.223962 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:21:29.331340 ip-10-0-136-27 
kubenswrapper[2580]: I0417 19:21:29.331289 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:21:42.331911 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:21:42.331862 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:21:52.527714 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:21:52.527634 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:21:57.925956 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:21:57.925916 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:22:03.748222 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:22:03.748178 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:22:08.230185 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:22:08.230148 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:22:16.629656 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:22:16.629618 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:22:32.728587 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:22:32.728554 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:22:41.323338 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:22:41.323301 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:22:49.525786 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:22:49.525749 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:22:58.829444 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:22:58.829406 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:23:22.122956 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:22.122922 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:23:34.522878 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:34.522840 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r7cmf"] Apr 17 19:23:36.548739 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:36.548700 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-6c9d9f76ff-rmbhf_43131641-b000-4428-86a0-ddd5400ee917/authorino/0.log" Apr 17 19:23:41.068679 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:41.068646 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6fc6488c9d-ssdkx_cd23e58b-3562-491c-b3d9-018ee82978f8/manager/0.log" Apr 17 19:23:42.417170 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:42.417134 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-6c9d9f76ff-rmbhf_43131641-b000-4428-86a0-ddd5400ee917/authorino/0.log" Apr 17 19:23:42.755193 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:42.755167 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-k59g5_3409cd12-0cc7-4339-b457-935a5401c7dc/kuadrant-console-plugin/0.log" Apr 17 19:23:43.107325 
ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:43.107246 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-r7cmf_ff41be55-78ab-404e-b574-38e9ead63668/limitador/0.log" Apr 17 19:23:44.907268 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:44.907237 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-59447f86f4-x4f2p_78a35df5-9008-4e23-af9a-61bac606b3ac/kube-auth-proxy/0.log" Apr 17 19:23:45.133972 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:45.133935 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-f4f48c65b-6jj2h_2050f398-269d-4da3-873f-4885dc5f98eb/router/0.log" Apr 17 19:23:54.732499 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.732461 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-558qr/must-gather-kp5wj"] Apr 17 19:23:54.732999 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.732831 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerName="cleanup" Apr 17 19:23:54.732999 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.732842 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerName="cleanup" Apr 17 19:23:54.732999 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.732858 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerName="cleanup" Apr 17 19:23:54.732999 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.732864 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerName="cleanup" Apr 17 19:23:54.732999 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.732871 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerName="cleanup" Apr 17 19:23:54.732999 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.732876 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerName="cleanup" Apr 17 19:23:54.732999 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.732953 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerName="cleanup" Apr 17 19:23:54.732999 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.732962 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerName="cleanup" Apr 17 19:23:54.732999 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.732968 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1842043-678b-45c3-9ddd-b63c0a47de52" containerName="cleanup" Apr 17 19:23:54.736206 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.736186 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-558qr/must-gather-kp5wj" Apr 17 19:23:54.738707 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.738691 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-558qr\"/\"kube-root-ca.crt\"" Apr 17 19:23:54.738707 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.738701 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-558qr\"/\"openshift-service-ca.crt\"" Apr 17 19:23:54.739588 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.739571 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-558qr\"/\"default-dockercfg-6t7f8\"" Apr 17 19:23:54.749555 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.749535 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-558qr/must-gather-kp5wj"] Apr 17 19:23:54.836524 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.836490 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bqdj\" (UniqueName: \"kubernetes.io/projected/7dc28e63-4958-4117-a073-1c1b673f1fc6-kube-api-access-9bqdj\") pod \"must-gather-kp5wj\" (UID: \"7dc28e63-4958-4117-a073-1c1b673f1fc6\") " pod="openshift-must-gather-558qr/must-gather-kp5wj" Apr 17 19:23:54.836524 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.836528 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7dc28e63-4958-4117-a073-1c1b673f1fc6-must-gather-output\") pod \"must-gather-kp5wj\" (UID: \"7dc28e63-4958-4117-a073-1c1b673f1fc6\") " pod="openshift-must-gather-558qr/must-gather-kp5wj" Apr 17 19:23:54.937820 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.937783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bqdj\" (UniqueName: \"kubernetes.io/projected/7dc28e63-4958-4117-a073-1c1b673f1fc6-kube-api-access-9bqdj\") pod \"must-gather-kp5wj\" (UID: \"7dc28e63-4958-4117-a073-1c1b673f1fc6\") " pod="openshift-must-gather-558qr/must-gather-kp5wj" Apr 17 19:23:54.937820 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.937821 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7dc28e63-4958-4117-a073-1c1b673f1fc6-must-gather-output\") pod \"must-gather-kp5wj\" (UID: \"7dc28e63-4958-4117-a073-1c1b673f1fc6\") " pod="openshift-must-gather-558qr/must-gather-kp5wj" Apr 17 19:23:54.938157 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.938141 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7dc28e63-4958-4117-a073-1c1b673f1fc6-must-gather-output\") pod \"must-gather-kp5wj\" (UID: \"7dc28e63-4958-4117-a073-1c1b673f1fc6\") " pod="openshift-must-gather-558qr/must-gather-kp5wj" Apr 17 19:23:54.945402 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:54.945360 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bqdj\" (UniqueName: \"kubernetes.io/projected/7dc28e63-4958-4117-a073-1c1b673f1fc6-kube-api-access-9bqdj\") pod \"must-gather-kp5wj\" (UID: \"7dc28e63-4958-4117-a073-1c1b673f1fc6\") " pod="openshift-must-gather-558qr/must-gather-kp5wj" Apr 17 19:23:55.045355 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:55.045283 2580 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-558qr/must-gather-kp5wj" Apr 17 19:23:55.167928 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:55.167887 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-558qr/must-gather-kp5wj"] Apr 17 19:23:55.170490 ip-10-0-136-27 kubenswrapper[2580]: W0417 19:23:55.170462 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dc28e63_4958_4117_a073_1c1b673f1fc6.slice/crio-18ada0cacef0c210a5c640a482eecb9697d0aee711633675d9e031dfbcbf4212 WatchSource:0}: Error finding container 18ada0cacef0c210a5c640a482eecb9697d0aee711633675d9e031dfbcbf4212: Status 404 returned error can't find the container with id 18ada0cacef0c210a5c640a482eecb9697d0aee711633675d9e031dfbcbf4212 Apr 17 19:23:55.172594 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:55.172578 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 19:23:56.051209 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:56.051173 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-558qr/must-gather-kp5wj" event={"ID":"7dc28e63-4958-4117-a073-1c1b673f1fc6","Type":"ContainerStarted","Data":"18ada0cacef0c210a5c640a482eecb9697d0aee711633675d9e031dfbcbf4212"} Apr 17 19:23:57.056316 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:57.056259 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-558qr/must-gather-kp5wj" event={"ID":"7dc28e63-4958-4117-a073-1c1b673f1fc6","Type":"ContainerStarted","Data":"d43bf977374490e2387108b6f9eb888a63908f1d5a9256c14830250e2fe5eb58"} Apr 17 19:23:57.056316 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:57.056319 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-558qr/must-gather-kp5wj" event={"ID":"7dc28e63-4958-4117-a073-1c1b673f1fc6","Type":"ContainerStarted","Data":"0147b2b8c83f4bd3323db57c71b69e7282a82c68477591039020023a72a902bd"} Apr 17 19:23:57.072006 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:57.071944 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-558qr/must-gather-kp5wj" podStartSLOduration=2.2188442999999998 podStartE2EDuration="3.071923465s" podCreationTimestamp="2026-04-17 19:23:54 +0000 UTC" firstStartedPulling="2026-04-17 19:23:55.1727025 +0000 UTC m=+2090.825238404" lastFinishedPulling="2026-04-17 19:23:56.025781665 +0000 UTC m=+2091.678317569" observedRunningTime="2026-04-17 19:23:57.069520266 +0000 UTC m=+2092.722056208" watchObservedRunningTime="2026-04-17 19:23:57.071923465 +0000 UTC m=+2092.724459384" Apr 17 19:23:57.609299 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:57.609267 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jhmxj_df4bf5b8-bdc2-4ccc-b126-8107588c1304/global-pull-secret-syncer/0.log" Apr 17 19:23:57.651715 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:57.651672 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4vj6d_91f3ddfa-7360-41d5-bb54-9c1c21904eb4/konnectivity-agent/0.log" Apr 17 19:23:57.751660 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:23:57.751615 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-27.ec2.internal_6402b5e4dc46963653aa05278c9bac43/haproxy/0.log" Apr 17 19:24:02.544336 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:02.544301 2580 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-6c9d9f76ff-rmbhf_43131641-b000-4428-86a0-ddd5400ee917/authorino/0.log" Apr 17 19:24:02.687936 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:02.687833 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-k59g5_3409cd12-0cc7-4339-b457-935a5401c7dc/kuadrant-console-plugin/0.log" Apr 17 19:24:02.845436 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:02.845402 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-r7cmf_ff41be55-78ab-404e-b574-38e9ead63668/limitador/0.log" Apr 17 19:24:04.494610 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.494577 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-7sfxc_fb1d88d0-276e-45c5-8a26-0c045d19801b/cluster-monitoring-operator/0.log" Apr 17 19:24:04.520412 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.520381 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-wr6gx_a2f32f32-33e9-48d6-8d38-5b98d899f12b/kube-state-metrics/0.log" Apr 17 19:24:04.547050 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.547019 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-wr6gx_a2f32f32-33e9-48d6-8d38-5b98d899f12b/kube-rbac-proxy-main/0.log" Apr 17 19:24:04.568997 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.568960 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-wr6gx_a2f32f32-33e9-48d6-8d38-5b98d899f12b/kube-rbac-proxy-self/0.log" Apr 17 19:24:04.718400 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.718359 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nmk4n_b98b43f3-b8ff-4a36-b218-2addb5512968/node-exporter/0.log" Apr 17 19:24:04.739670 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.739640 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nmk4n_b98b43f3-b8ff-4a36-b218-2addb5512968/kube-rbac-proxy/0.log" Apr 17 19:24:04.759526 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.759503 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nmk4n_b98b43f3-b8ff-4a36-b218-2addb5512968/init-textfile/0.log" Apr 17 19:24:04.848469 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.848442 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-wr6gp_45de126b-8927-4aa9-b058-aaf39e7bf849/kube-rbac-proxy-main/0.log" Apr 17 19:24:04.880462 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.880436 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-wr6gp_45de126b-8927-4aa9-b058-aaf39e7bf849/kube-rbac-proxy-self/0.log" Apr 17 19:24:04.900809 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.900784 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-wr6gp_45de126b-8927-4aa9-b058-aaf39e7bf849/openshift-state-metrics/0.log" Apr 17 19:24:04.930607 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.930581 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c212ee4b-5e61-4b8c-8393-d10a84fbbf85/prometheus/0.log" Apr 17 
19:24:04.950004 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.949981 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c212ee4b-5e61-4b8c-8393-d10a84fbbf85/config-reloader/0.log" Apr 17 19:24:04.970271 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.970243 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c212ee4b-5e61-4b8c-8393-d10a84fbbf85/thanos-sidecar/0.log" Apr 17 19:24:04.993474 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:04.993448 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c212ee4b-5e61-4b8c-8393-d10a84fbbf85/kube-rbac-proxy-web/0.log" Apr 17 19:24:05.014617 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.014546 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c212ee4b-5e61-4b8c-8393-d10a84fbbf85/kube-rbac-proxy/0.log" Apr 17 19:24:05.035282 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.035236 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c212ee4b-5e61-4b8c-8393-d10a84fbbf85/kube-rbac-proxy-thanos/0.log" Apr 17 19:24:05.046680 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.046648 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 19:24:05.048645 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.048620 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 19:24:05.054201 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.054175 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 19:24:05.055438 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.055416 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 19:24:05.055622 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.055602 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c212ee4b-5e61-4b8c-8393-d10a84fbbf85/init-config-reloader/0.log" Apr 17 19:24:05.123570 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.123545 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-rccbm_406d0f3c-3b14-407a-a8a2-2894a5d84b73/prometheus-operator-admission-webhook/0.log" Apr 17 19:24:05.224242 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.224204 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65678c6864-lv862_a62685de-f03c-4d27-8601-ef4f4810a833/thanos-query/0.log" Apr 17 19:24:05.244020 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.243989 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65678c6864-lv862_a62685de-f03c-4d27-8601-ef4f4810a833/kube-rbac-proxy-web/0.log" Apr 17 19:24:05.263142 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.263113 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-65678c6864-lv862_a62685de-f03c-4d27-8601-ef4f4810a833/kube-rbac-proxy/0.log" Apr 17 19:24:05.282509 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.282442 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65678c6864-lv862_a62685de-f03c-4d27-8601-ef4f4810a833/prom-label-proxy/0.log" Apr 17 19:24:05.302171 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.302139 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65678c6864-lv862_a62685de-f03c-4d27-8601-ef4f4810a833/kube-rbac-proxy-rules/0.log" Apr 17 19:24:05.322174 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:05.322148 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65678c6864-lv862_a62685de-f03c-4d27-8601-ef4f4810a833/kube-rbac-proxy-metrics/0.log" Apr 17 19:24:06.152431 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.152388 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc"] Apr 17 19:24:06.159470 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.159430 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.160733 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.160687 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc"] Apr 17 19:24:06.252799 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.252765 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b2af9ab-e50b-4684-8520-d9ec7a77a594-sys\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.253025 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.252965 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b2af9ab-e50b-4684-8520-d9ec7a77a594-lib-modules\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.253025 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.253008 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7b2af9ab-e50b-4684-8520-d9ec7a77a594-proc\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.253150 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.253042 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7b2af9ab-e50b-4684-8520-d9ec7a77a594-podres\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.253150 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.253082 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-288ds\" (UniqueName: 
\"kubernetes.io/projected/7b2af9ab-e50b-4684-8520-d9ec7a77a594-kube-api-access-288ds\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.317073 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.317046 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-qns4x_6620fc0c-c3f6-4c91-a578-386ce8be85f5/networking-console-plugin/0.log" Apr 17 19:24:06.354065 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.354023 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7b2af9ab-e50b-4684-8520-d9ec7a77a594-podres\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.354245 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.354085 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-288ds\" (UniqueName: \"kubernetes.io/projected/7b2af9ab-e50b-4684-8520-d9ec7a77a594-kube-api-access-288ds\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.354245 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.354187 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b2af9ab-e50b-4684-8520-d9ec7a77a594-sys\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.354245 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.354229 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b2af9ab-e50b-4684-8520-d9ec7a77a594-lib-modules\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.354399 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.354254 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7b2af9ab-e50b-4684-8520-d9ec7a77a594-proc\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.354399 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.354346 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7b2af9ab-e50b-4684-8520-d9ec7a77a594-proc\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.354489 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.354466 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7b2af9ab-e50b-4684-8520-d9ec7a77a594-podres\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.354833 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.354803 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b2af9ab-e50b-4684-8520-d9ec7a77a594-sys\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.354975 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.354956 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b2af9ab-e50b-4684-8520-d9ec7a77a594-lib-modules\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.364794 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.364762 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-288ds\" (UniqueName: \"kubernetes.io/projected/7b2af9ab-e50b-4684-8520-d9ec7a77a594-kube-api-access-288ds\") pod \"perf-node-gather-daemonset-sbmpc\" (UID: \"7b2af9ab-e50b-4684-8520-d9ec7a77a594\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.475121 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.475046 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:06.636177 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.636118 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc"] Apr 17 19:24:06.639171 ip-10-0-136-27 kubenswrapper[2580]: W0417 19:24:06.639143 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7b2af9ab_e50b_4684_8520_d9ec7a77a594.slice/crio-047b0c97b38f8778422b562a46aa240fbe3b2abff0cdd4571b5588d4925b917a WatchSource:0}: Error finding container 047b0c97b38f8778422b562a46aa240fbe3b2abff0cdd4571b5588d4925b917a: Status 404 returned error can't find the container with id 047b0c97b38f8778422b562a46aa240fbe3b2abff0cdd4571b5588d4925b917a Apr 17 19:24:06.828093 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.828067 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/2.log" Apr 17 19:24:06.833252 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:06.833215 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whmp8_160e7e3f-dd7c-4341-8e04-ec0fc5728152/console-operator/3.log" Apr 17 19:24:07.100936 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:07.100817 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" event={"ID":"7b2af9ab-e50b-4684-8520-d9ec7a77a594","Type":"ContainerStarted","Data":"17359bd65bd0733f87271c64182448b82a74c187e3722072a651e16538092850"} Apr 17 19:24:07.100936 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:07.100855 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" event={"ID":"7b2af9ab-e50b-4684-8520-d9ec7a77a594","Type":"ContainerStarted","Data":"047b0c97b38f8778422b562a46aa240fbe3b2abff0cdd4571b5588d4925b917a"} Apr 17 19:24:07.100936 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:07.100922 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:07.118044 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:07.117981 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" podStartSLOduration=1.117963087 podStartE2EDuration="1.117963087s" podCreationTimestamp="2026-04-17 19:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 19:24:07.114545702 +0000 UTC m=+2102.767081627" watchObservedRunningTime="2026-04-17 19:24:07.117963087 +0000 UTC m=+2102.770499013" Apr 17 19:24:07.751327 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:07.751292 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-g9wkh_ea018e6b-4b5a-464d-ba7a-d5dd8fb8f0b6/volume-data-source-validator/0.log" Apr 17 19:24:08.513070 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:08.513043 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l4g6s_648e7199-fd23-4496-ad24-5b9e829d77fa/dns/0.log" Apr 17 19:24:08.533072 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:08.533041 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l4g6s_648e7199-fd23-4496-ad24-5b9e829d77fa/kube-rbac-proxy/0.log" Apr 17 19:24:08.639039 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:08.639011 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zsszv_87361592-a029-4a93-9af8-ed4f1a1cc87c/dns-node-resolver/0.log" Apr 17 19:24:09.109615 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:09.109581 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vnmqw_39fdb78c-608d-4cb2-8a53-feb04ee1cdcf/node-ca/0.log" Apr 17 19:24:09.982532 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:09.982504 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-59447f86f4-x4f2p_78a35df5-9008-4e23-af9a-61bac606b3ac/kube-auth-proxy/0.log" Apr 17 19:24:10.090622 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:10.090597 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-f4f48c65b-6jj2h_2050f398-269d-4da3-873f-4885dc5f98eb/router/0.log" Apr 17 19:24:10.546594 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:10.546566 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-94ggh_98b160b8-551d-443c-a3a0-4d046919e27c/serve-healthcheck-canary/0.log" Apr 17 19:24:11.153308 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:11.153273 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mgsw9_7e24123b-7334-43c8-abd1-998265d69576/kube-rbac-proxy/0.log" Apr 17 19:24:11.175368 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:11.175346 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mgsw9_7e24123b-7334-43c8-abd1-998265d69576/exporter/0.log" Apr 17 19:24:11.195781 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:11.195761 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mgsw9_7e24123b-7334-43c8-abd1-998265d69576/extractor/0.log" Apr 17 19:24:13.116022 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:13.115996 2580 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-sbmpc" Apr 17 19:24:13.152447 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:13.152415 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6fc6488c9d-ssdkx_cd23e58b-3562-491c-b3d9-018ee82978f8/manager/0.log" Apr 17 19:24:14.223788 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:14.223750 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-697b5bd5df-2vp4b_7ffac97d-c664-4d6c-a7ed-83dd7121891a/manager/0.log" Apr 17 19:24:19.949645 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:19.949616 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4hnl2_ff6339d8-541a-4947-b00d-5f600d8d08c1/kube-multus-additional-cni-plugins/0.log" Apr 17 19:24:19.969743 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:19.969713 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4hnl2_ff6339d8-541a-4947-b00d-5f600d8d08c1/egress-router-binary-copy/0.log" Apr 17 19:24:19.990929 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:19.990881 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4hnl2_ff6339d8-541a-4947-b00d-5f600d8d08c1/cni-plugins/0.log" Apr 17 19:24:20.010570 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:20.010542 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4hnl2_ff6339d8-541a-4947-b00d-5f600d8d08c1/bond-cni-plugin/0.log" Apr 17 19:24:20.030475 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:20.030454 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4hnl2_ff6339d8-541a-4947-b00d-5f600d8d08c1/routeoverride-cni/0.log" Apr 17 19:24:20.051603 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:20.051582 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4hnl2_ff6339d8-541a-4947-b00d-5f600d8d08c1/whereabouts-cni-bincopy/0.log" Apr 17 19:24:20.071471 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:20.071444 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4hnl2_ff6339d8-541a-4947-b00d-5f600d8d08c1/whereabouts-cni/0.log" Apr 17 19:24:20.435637 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:20.435591 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vcf4s_30fb60c5-1e4d-49ac-bfdc-b8a3ba658316/kube-multus/0.log" Apr 17 19:24:20.563059 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:20.563033 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-v24kx_70ceb0f8-7a3d-4e29-9470-f18b8af1daa1/network-metrics-daemon/0.log" Apr 17 19:24:20.582594 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:20.582567 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-v24kx_70ceb0f8-7a3d-4e29-9470-f18b8af1daa1/kube-rbac-proxy/0.log" Apr 17 19:24:21.353938 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:21.353886 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-controller/0.log" Apr 17 19:24:21.371039 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:21.371011 2580 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/0.log" Apr 17 19:24:21.381413 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:21.381364 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovn-acl-logging/1.log" Apr 17 19:24:21.398236 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:21.398208 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/kube-rbac-proxy-node/0.log" Apr 17 19:24:21.417784 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:21.417764 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 19:24:21.435069 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:21.435049 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/northd/0.log" Apr 17 19:24:21.455283 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:21.455261 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/nbdb/0.log" Apr 17 19:24:21.478028 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:21.478004 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/sbdb/0.log" Apr 17 19:24:21.583273 ip-10-0-136-27 kubenswrapper[2580]: I0417 19:24:21.583248 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7klhd_87d38419-312e-4358-b18c-7e7b24e8189f/ovnkube-controller/0.log"